A New Thought Leader in BCI Technology

Researchers improved noninvasive BCIs with AI, producing decoders that outperform traditional systems and enable more practical control of devices with the mind.

Nick Bild
16 days ago • Machine Learning & AI
Using a noninvasive BCI to control a computer (📷: D. Forenzo et al.)

With some high-profile projects underway at present, brain-computer interfaces (BCIs) are increasingly coming into the spotlight. BCIs are systems that enable direct communication between the human brain and external devices. These interfaces work by capturing brain signals via some type of sensing equipment and translating them into commands that can be understood by computers or other machines. BCIs have the potential to transform the way humans interact with technology and provide a bridge between the mind and the digital world.

For individuals with certain disabilities, such as paralysis, BCIs can offer transformative opportunities. These interfaces can help restore communication, mobility, and control for those who have lost the ability to move or speak. For instance, people with locked-in syndrome or severe motor impairments can use BCIs to control prosthetic limbs, wheelchairs, or even digital devices like computers and smartphones.

Beyond aiding those with disabilities, BCIs hold promise for enhancing the experiences of individuals without disabilities. For example, BCIs could be used to control smart home devices, allowing people to manage their environments using only their thoughts. In gaming and virtual reality, BCIs could offer immersive experiences by providing players with the ability to control game elements using their minds. In professional settings, BCIs could streamline tasks, improve productivity, and enhance creativity by allowing users to interact with digital interfaces more intuitively and efficiently.

But at present, the most effective BCIs require electrodes to be implanted directly into the brain. Needless to say, this greatly limits the applications for which they can be used. Noninvasive options also exist; however, they do not perform well enough to be widely adopted. But this could all change in the near future, thanks to the work of a team at Carnegie Mellon University. They have built a system that can accurately interpret brain signals captured from a noninvasive electroencephalogram (EEG) headset.

To improve the performance of EEG decoders, the researchers designed and built a pair of deep learning-based algorithms. Similar approaches have shown limited success in the past because deep learning requires very large amounts of training data, which is difficult and time-consuming to collect for EEG. So the team first created an automated system for labeling EEG data, eliminating the need for manual annotation and making it far easier to assemble a large dataset for training supervised machine learning models.
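The article does not detail the labeling pipeline, but in a typical cursor-control paradigm the labels can come for free: each trial has a known on-screen target, so every EEG window recorded during that trial can be stamped with the target's direction automatically. The sketch below illustrates that general idea in Python with NumPy; the function name `label_eeg_windows`, the sampling rate, and the window length are illustrative assumptions, not details from the study.

```python
# Minimal sketch of automated EEG labeling, assuming a cursor-control
# paradigm where each trial's known target direction supplies the label,
# so no manual annotation is needed. Names and parameters are hypothetical.
import numpy as np

def label_eeg_windows(eeg, target_dirs, fs=250, window_s=1.0):
    """Slice continuous EEG (channels x samples) into fixed-length windows
    and label each window with the target direction of its trial."""
    win = int(fs * window_s)
    n_windows = eeg.shape[1] // win
    X = np.stack([eeg[:, i * win:(i + 1) * win] for i in range(n_windows)])
    # Assumes one trial per window; clamp the index as a simple guard.
    y = np.asarray([target_dirs[min(i, len(target_dirs) - 1)]
                    for i in range(n_windows)])
    return X, y

# Example: 8 channels, 10 s of synthetic EEG at 250 Hz, one trial per second.
rng = np.random.default_rng(0)
eeg = rng.standard_normal((8, 2500))
target_dirs = rng.integers(0, 4, size=10)  # 4 possible target directions
X, y = label_eeg_windows(eeg, target_dirs)
print(X.shape, y.shape)  # (10, 8, 250) (10,)
```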

This data was used to train deep learning models based on the EEGNet and PointNet architectures. A series of experiments was then conducted in which the performance of these models was compared with that of a traditional autoregression-based EEG decoder. The findings revealed that the deep learning-based decoders improved as the volume of training data increased, outperforming the traditional decoder by the final session. EEGNet and PointNet performed similarly to one another, while the traditional decoder maintained consistent performance but did not improve as subjects gained experience.
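EEGNet is a compact convolutional network designed specifically for EEG, and it is worth seeing how little code a decoder of this family requires. Below is a minimal EEGNet-style classifier in PyTorch; the layer pattern (temporal convolution, depthwise spatial convolution, separable convolution, linear readout) follows the published EEGNet design, but every size and hyperparameter here is an illustrative assumption rather than a value from this study.

```python
# Minimal EEGNet-style classifier in PyTorch. The layer pattern follows
# the published EEGNet design; all sizes below are illustrative only.
import torch
import torch.nn as nn

class EEGNetSketch(nn.Module):
    def __init__(self, n_channels=8, n_samples=250, n_classes=4,
                 f1=8, d=2, f2=16, dropout=0.25):
        super().__init__()
        self.features = nn.Sequential(
            # Temporal convolution: learns frequency-selective filters.
            nn.Conv2d(1, f1, (1, 64), padding=(0, 32), bias=False),
            nn.BatchNorm2d(f1),
            # Depthwise spatial convolution: per-filter electrode weightings.
            nn.Conv2d(f1, f1 * d, (n_channels, 1), groups=f1, bias=False),
            nn.BatchNorm2d(f1 * d),
            nn.ELU(),
            nn.AvgPool2d((1, 4)),
            nn.Dropout(dropout),
            # Separable convolution: mixes features over time, then across maps.
            nn.Conv2d(f1 * d, f1 * d, (1, 16), padding=(0, 8),
                      groups=f1 * d, bias=False),
            nn.Conv2d(f1 * d, f2, 1, bias=False),
            nn.BatchNorm2d(f2),
            nn.ELU(),
            nn.AvgPool2d((1, 8)),
            nn.Dropout(dropout),
            nn.Flatten(),
        )
        # Infer the flattened feature size with a dummy forward pass.
        with torch.no_grad():
            n_feats = self.features(
                torch.zeros(1, 1, n_channels, n_samples)).shape[1]
        self.classifier = nn.Linear(n_feats, n_classes)

    def forward(self, x):  # x: (batch, channels, samples)
        return self.classifier(self.features(x.unsqueeze(1)))

model = EEGNetSketch()
logits = model(torch.randn(4, 8, 250))
print(logits.shape)  # torch.Size([4, 4])
```

The depthwise spatial convolution is the key design choice: it learns a separate set of electrode weightings for each temporal filter, which keeps the parameter count small enough to train on the modest datasets typical of EEG studies.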

This study demonstrated the potential of deep learning-based decoders to enhance the performance of EEG decoding systems, which could open the door to more advanced BCI applications. The authors also suggest that future efforts should focus on improving training methods and exploring other deep learning architectures to make noninvasive BCIs more accurate and practical.
