Background: Brain-machine interfaces (BMIs) allow the direct translation of electric, magnetic or metabolic brain signals into control commands for external devices such as robots, prostheses or exoskeletons. However, the non-stationarity of brain signals and their susceptibility to biological or environmental artifacts impede reliable control and safety of BMIs, particularly in daily life environments. Here we introduce and test a novel hybrid brain-neural computer interaction (BNCI) system that fuses electroencephalography (EEG) and electrooculography (EOG) to enhance the reliability and safety of continuous hand exoskeleton-driven grasping motions.
Findings: Twelve healthy volunteers (8 male, mean age 28.1 ± 3.63 years) used EEG (condition #1) and hybrid EEG/EOG (condition #2) signals to control a hand exoskeleton. Motor imagery-related brain activity was translated into exoskeleton-driven hand closing motions. Unintended motions could be interrupted by eye movement-related EOG signals. To evaluate BNCI control and safety, participants were instructed to follow a visual cue indicating, in random order, either to move or not to move the hand exoskeleton. Movements exceeding 25% of a full grasping motion while the device was not supposed to move were defined as safety violations. While participants reached comparable control under both conditions, safety was frequently violated under condition #1 (EEG) but not under condition #2 (EEG/EOG).
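For illustration, the hybrid control and veto logic summarized above can be expressed as a minimal control-loop sketch in Python. All thresholds, increments, and function names below are assumptions chosen for exposition, not the study's actual implementation; only the 25% safety criterion is taken from the protocol.

# Minimal sketch of hybrid EEG/EOG exoskeleton control (assumed parameters).
GRASP_INCREMENT = 0.02   # assumed fraction of a full grasp added per control cycle
SAFETY_LIMIT = 0.25      # from the protocol: >25% closure during a rest cue = violation
EEG_THRESHOLD = 0.6      # assumed motor-imagery classifier threshold
EOG_THRESHOLD = 0.8      # assumed eye-movement (veto) detector threshold

def update_grasp(grasp_state, eeg_mi_score, eog_veto_score, use_eog_veto):
    """Advance the exoskeleton closure by one control cycle.

    grasp_state    -- current closure, 0.0 (open) to 1.0 (fully closed)
    eeg_mi_score   -- motor-imagery classifier output, 0.0 to 1.0
    eog_veto_score -- eye-movement detector output, 0.0 to 1.0
    use_eog_veto   -- True in the hybrid EEG/EOG condition (#2)
    """
    if use_eog_veto and eog_veto_score > EOG_THRESHOLD:
        return grasp_state  # EOG veto: freeze the ongoing motion
    if eeg_mi_score > EEG_THRESHOLD:
        return min(1.0, grasp_state + GRASP_INCREMENT)  # continue closing
    return grasp_state

def is_safety_violation(grasp_state, rest_cue_shown):
    """Closure beyond 25% of a full grasp during a rest cue counts as a violation."""
    return rest_cue_shown and grasp_state > SAFETY_LIMIT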
Conclusion: EEG/EOG biosignal fusion can substantially enhance the safety of assistive BNCI systems, improving their applicability in daily life environments.