A Python-Driven Brain-Computer Interface Delivers Dexterous Control of Individual Robotic Fingers

Data from non-invasive EEG headsets, fed into a deep learning model, can deliver two- and three-finger control of a robotic hand, a study has shown.

Researchers from Carnegie Mellon University's Departments of Biomedical Engineering and Electrical and Computer Engineering and its Neuroscience Institute have demonstrated a non-invasive brain-computer interface (BCI) sensitive enough to deliver per-finger control over a robotic hand.

"Improving hand function is a top priority for both impaired and able-bodied individuals, as even small gains can meaningfully enhance ability and quality of life," says corresponding author Bin He, professor of biomedical engineering at Carnegie Mellon University, of the team's work. "However, real-time decoding of dexterous individual finger movements using non-invasive brain signals has remained an elusive goal, largely due to the limited spatial resolution of EEG [Electroencephalography]."

Researchers have been able to drive a robotic hand at the level of individual fingers using a non-invasive EEG-based brain-computer interface. (📹: Carnegie Mellon University)

Traditionally, motorized prosthetics are controlled using the muscles in the wearer's arm, but there's a need for an alternative for those for whom muscle measurement using electromyography (EMG) doesn't work. Dexterous control has already been demonstrated using invasive brain-computer interface implants, but non-invasive external interfaces based on electroencephalography have, until now, lacked the resolution for fine motor control.

The team's work saw 21 able-bodied individuals, previously trained for around two hours in limb-level but not finger-level BCI operation, fitted with a non-invasive EEG sensor rig; the recorded signals were decoded in real time during movement execution (ME) and motor imagery (MI) tasks, focusing on per-finger control rather than the simple opening and closing of a fist. The results, driven by a fine-tuned version of the EEGNet-8,2 deep learning network, show promise: nearly 81 percent accuracy was achieved for tasks involving two fingers, dropping to 60.6 percent for tasks involving three fingers.
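For context, EEGNet is a compact convolutional network designed for EEG classification, and the "8,2" designation refers to its eight temporal filters and depthwise multiplier of two. The following is a minimal PyTorch sketch of that style of architecture, not the team's actual code (which is linked below); the channel count, sampling rate, window length, kernel sizes, and dropout rate here are illustrative assumptions rather than the study's exact configuration.

```python
# Minimal sketch of an EEGNet-8,2-style classifier in PyTorch.
# Assumptions (illustrative, not the study's setup): 64 EEG channels,
# one-second windows at 250 Hz, three output classes (e.g. three fingers).
import torch
import torch.nn as nn

class EEGNetSketch(nn.Module):
    def __init__(self, n_channels=64, n_samples=250, n_classes=3,
                 f1=8, d=2, dropout=0.5):
        super().__init__()
        f2 = f1 * d  # pointwise filter count, per the EEGNet convention
        self.features = nn.Sequential(
            # Block 1: temporal convolution, then depthwise spatial filtering
            nn.Conv2d(1, f1, (1, 64), padding="same", bias=False),
            nn.BatchNorm2d(f1),
            nn.Conv2d(f1, f1 * d, (n_channels, 1), groups=f1, bias=False),
            nn.BatchNorm2d(f1 * d),
            nn.ELU(),
            nn.AvgPool2d((1, 4)),
            nn.Dropout(dropout),
            # Block 2: separable convolution (depthwise, then pointwise)
            nn.Conv2d(f1 * d, f1 * d, (1, 16), groups=f1 * d,
                      padding="same", bias=False),
            nn.Conv2d(f1 * d, f2, 1, bias=False),
            nn.BatchNorm2d(f2),
            nn.ELU(),
            nn.AvgPool2d((1, 8)),
            nn.Dropout(dropout),
            nn.Flatten(),
        )
        # The two pooling stages shrink the time axis by a factor of 32.
        self.classifier = nn.Linear(f2 * (n_samples // 32), n_classes)

    def forward(self, x):  # x: (batch, 1, channels, samples)
        return self.classifier(self.features(x))

# One 64-channel, one-second EEG epoch in; logits over three classes out.
model = EEGNetSketch()
print(model(torch.randn(1, 1, 64, 250)).shape)  # torch.Size([1, 3])
```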

"The insights gained from this study hold immense potential to elevate the clinical relevance of non-invasive BCIs and enable applications across a broader population," He says of the research. "Our study highlights the transformative potential of EEG-based BCIs and their application beyond basic communication to intricate motor control."

"Despite the inherent challenges of individual finger movement decoding using non-invasive techniques," the researchers conclude, "the performance achieved in this study underscores the significant promise for developing more intricate and naturalistic non-invasive BCI systems. The successful demonstration of individual robotic finger control represents a substantial advancement in dexterous EEG-BCI systems and serves as a critical step forward, guiding future research in the field."

The team's work has been published in the journal Nature Communications under open-access terms; Python source code is available on GitHub under the permissive MIT license.

Gareth Halfacree
Freelance journalist, technical author, hacker, tinkerer, erstwhile sysadmin. For hire: freelance@halfacree.co.uk.