Researchers Develop an AI "Co-Pilot" for Better Non-Invasive Brain-Computer Interfaces

EEG caps paired with an artificially intelligent assistant deliver improved task completion for computer and robot operation.

Researchers from UCLA have come up with a way to boost task completion via computers or robot arms controlled using a non-invasive brain-computer interface (BCI) — by adding an artificially intelligent "co-pilot" to the mix.

"By using artificial intelligence to complement brain-computer interface systems, we're aiming for much less risky and invasive avenues," claims project lead Jonathan Kao, associate professor of electrical and computer engineering at the UCLA Samueli School of Engineering, of the team's work. "Ultimately, we want to develop AI-BCI systems that offer shared autonomy, allowing people with movement disorders, such as paralysis or ALS [Amyotrophic Lateral Sclerosis], to regain some independence for everyday tasks."

An AI "co-pilot" has been shown to improve task completion for non-invasive brain-computer interface users. (📹: UCLA)

The team's work centered around non-invasive brain-computer interfaces, which use external electrodes to capture brain activity via electroencephalography (EEG) — much less invasive than surgically implanted systems, but also delivering lower performance. It's that latter issue the team's approach attempts to work around, by introducing an artificial intelligence model to interpret the decoded signals and assist in completing the desired task.
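The paper itself is closed-access, so the exact method isn't detailed here, but the "shared autonomy" idea Kao describes is commonly implemented by blending the user's decoded command with an AI suggestion steering toward an inferred goal. The following is a minimal illustrative sketch of that general pattern; the function names, the goal-inference shortcut, and the blending weight are all assumptions for illustration, not the team's actual algorithm.

```python
import numpy as np

def shared_autonomy_step(decoded_velocity, cursor_pos, inferred_goal, alpha=0.5):
    """Blend a noisy EEG-decoded velocity with a co-pilot correction.

    Hypothetical sketch: the co-pilot steers toward the goal it believes
    the user wants; alpha=0 is pure user control, alpha=1 is pure AI.
    """
    to_goal = inferred_goal - cursor_pos
    norm = np.linalg.norm(to_goal)
    # Unit vector toward the inferred goal (zero if already there).
    copilot_velocity = to_goal / norm if norm > 1e-9 else np.zeros_like(to_goal)
    # Weighted combination of user intent and AI assistance.
    return (1 - alpha) * decoded_velocity + alpha * copilot_velocity

# Example: a noisy decoded command pointing roughly toward a target.
pos = np.array([0.0, 0.0])
goal = np.array([1.0, 0.0])
decoded = np.array([0.6, 0.4])  # imperfect EEG-decoded velocity
blended = shared_autonomy_step(decoded, pos, goal, alpha=0.5)
# blended is pulled closer to the true goal direction than `decoded` was.
```

In practice the hard parts are the EEG decoder producing `decoded_velocity` and the model inferring the goal, which is where the team's AI co-pilot comes in.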

To prove the concept, the team had four participants — one of whom was paralyzed from the waist down — don an EEG head cap and complete two key tasks by thought alone: moving a mouse cursor from target to target and using a robotic arm to move four blocks. Without the AI in the loop, completion proved difficult; with the AI, the tasks were completed much more quickly — and the paralyzed participant was able to finish the robot task in around six and a half minutes, after failing to finish without assistance.

"Next steps for AI-BCI systems could include the development of more advanced co-pilots that move robotic arms with more speed and precision, and offer a deft touch that adapts to the object the user wants to grasp," suggests co-lead author Johannes Lee. "And adding in larger-scale training data could also help the AI collaborate on more complex tasks, as well as improve EEG decoding itself."

The team's work has been published in the journal Nature Machine Intelligence under closed-access terms; model training code has been published to GitHub under the permissive three-clause BSD license.

Gareth Halfacree
Freelance journalist, technical author, hacker, tinkerer, erstwhile sysadmin. For hire: freelance@halfacree.co.uk.