Title
MOABB: Trustworthy algorithm benchmarking for BCIs.
Abstract
Objective. Brain-computer interface (BCI) algorithm development has long been hampered by two major issues: small sample sets and a lack of reproducibility. We offer a solution to both problems via a software suite that streamlines reliable data retrieval and preprocessing and provides a consistent interface for machine learning methods. Approach. By building on recent advances in signal analysis software implemented in the MNE toolkit, and the unified framework for machine learning offered by the scikit-learn project, we present a system that can improve BCI algorithm development. This system is fully open-source under the BSD licence and available at https://github.com/NeuroTechX/moabb. Main results. We analyze a set of state-of-the-art decoding algorithms across 12 open-access datasets comprising over 250 subjects. Our results show that even the best methods yield no significant improvement on some datasets, and further that many previously validated methods do not generalize well outside the datasets they were originally tested on. Significance. Our analysis confirms that BCI algorithms validated on single datasets are not representative, highlighting the need for more robust validation in the BCI machine learning community.
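The abstract describes MOABB as a layer over MNE and scikit-learn that exposes datasets, paradigms, and evaluation schemes through a common pipeline interface. As an illustration only, the following is a minimal sketch of how a benchmark is typically assembled with that interface; the class names used here (BNCI2014001, LeftRightImagery, WithinSessionEvaluation) reflect the MOABB API around the time of publication and may be spelled differently in later releases.

# Minimal MOABB benchmarking sketch (illustrative; API names assumed from the
# releases contemporary with this paper and may differ in newer versions).
from sklearn.pipeline import make_pipeline
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis as LDA
from mne.decoding import CSP

from moabb.datasets import BNCI2014001
from moabb.paradigms import LeftRightImagery
from moabb.evaluations import WithinSessionEvaluation

# Any scikit-learn Pipeline can be benchmarked; here a classic CSP + LDA decoder.
pipelines = {"CSP+LDA": make_pipeline(CSP(n_components=8), LDA())}

# The paradigm defines preprocessing and epoching, the dataset supplies the raw EEG,
# and the evaluation defines the cross-validation scheme (within-session here).
paradigm = LeftRightImagery()
datasets = [BNCI2014001()]
evaluation = WithinSessionEvaluation(paradigm=paradigm, datasets=datasets)

# Returns a pandas DataFrame with one score per subject, session, and pipeline.
results = evaluation.process(pipelines)
print(results[["dataset", "subject", "pipeline", "score"]].head())

Because pipelines are plain scikit-learn estimators, the same dictionary can be passed unchanged to evaluations over additional datasets, which is what enables the multi-dataset comparison reported in the paper.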
Year
2018
DOI
10.1088/1741-2552/aadea0
Venue
Journal of Neural Engineering
Keywords
brain-computer interfacing, EEG, machine learning, BCI, spatial filtering, CSP, software
Field
Signal processing, Computer science, Trustworthiness, Software suite, Brain–computer interface, Algorithm, Preprocessor, Software, Artificial intelligence, Decoding methods, Machine learning, Benchmarking
DocType
Journal
Volume
15
Issue
6
ISSN
1741-2560
Citations
5
PageRank
0.39
References
11
Authors
2
Name | Order | Citations | PageRank
Vinay Jayaram | 1 | 37 | 2.12
Alexandre Barachant | 2 | 100 | 8.80