This BCI Keeps Going and Going and Going
A novel AI algorithm enables brain-computer interfaces to adapt to the natural variability of the brain, allowing for long-term use.
The brain-computer interface (BCI) is an emerging technology that holds tremendous promise, both for improving productivity and for helping people overcome physical limitations. These devices can enable communication at the speed of thought, restore the function of lost limbs, and much more. But the promise of this technology has yet to be fully realized due to gaps in our understanding of the brain and other factors that limit the practical application of today’s BCIs to real-world problems.
Further complicating matters, even when a workable solution has been developed, it tends to stop working within just a day or two. However, researchers at the University of California, San Francisco have recently made a breakthrough that could change this. In the course of their work, a paralyzed man was given the ability to control a robotic arm using a BCI that functioned for a record seven months without needing significant recalibration.
One of the biggest challenges in BCI technology has been the brain’s natural variability. Neural activity patterns shift from day to day, meaning that a system trained to recognize a user’s intentions on one day may not work as well the next. The team found that while the general shape of brain activity representations remained stable, their exact locations changed slightly over time.
To address this problem, the team developed an artificial intelligence (AI) model capable of adjusting for these shifts. By adapting to small changes in neural activity patterns over time, the model made long-term use of the system possible.
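To make the idea concrete, the sketch below shows one simple way such drift compensation can work in principle: briefly re-estimate where each imagined movement's activity pattern sits on a new day, fit a linear map that sends those patterns back into the reference day's space, and keep using the original decoder. This is only an illustration with assumed names and toy data, not the UCSF team's actual model.

```python
# Minimal sketch (not the study's actual algorithm): compensate for small
# day-to-day drift by re-aligning new neural features to a reference day,
# so a previously trained decoder can keep working. All names are illustrative.
import numpy as np

def fit_alignment(new_means: np.ndarray, ref_means: np.ndarray) -> np.ndarray:
    """Least-squares linear map sending new-day class means onto the
    reference-day class means (each array: n_classes x n_features)."""
    W, *_ = np.linalg.lstsq(new_means, ref_means, rcond=None)
    return W

def align(features: np.ndarray, W: np.ndarray) -> np.ndarray:
    """Project today's neural features back into the reference space."""
    return features @ W

# --- toy demonstration ---
rng = np.random.default_rng(0)
n_classes, n_features = 4, 16                      # e.g., 4 imagined movements
ref_means = rng.normal(size=(n_classes, n_features))
drift = np.eye(n_features) + 0.1 * rng.normal(size=(n_features, n_features))
new_means = ref_means @ drift                      # slightly shifted representations

W = fit_alignment(new_means, ref_means)
today_sample = new_means[2] + 0.01 * rng.normal(size=n_features)
realigned = align(today_sample, W)                 # comparable to the original decoder's inputs
print(np.argmin(np.linalg.norm(ref_means - realigned, axis=1)))  # expected: 2
```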
The participant involved in the study, who had been paralyzed for years due to a stroke, had tiny sensors implanted on the surface of his brain. These sensors detected the neural activity associated with imagined movements. Initially, he practiced imagining moving different parts of his body, such as his hands, feet, and head. The AI then learned to recognize the corresponding brain activity and translate it into commands for a robotic arm.
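As a rough illustration of this kind of decoding step, the sketch below trains a simple classifier on made-up neural feature vectors and maps a new window of activity to a discrete arm command. The channel counts, command names, and classifier choice are all assumptions for the example, not details from the study.

```python
# Illustrative only: a simple decoder that maps windows of neural activity
# recorded during imagined movements to discrete robotic-arm commands.
import numpy as np
from sklearn.linear_model import LogisticRegression

COMMANDS = ["hand_open", "hand_close", "reach_left", "reach_right"]

# Made-up training data: one 64-channel feature vector per imagined-movement
# trial, with each command's trials drawn from a different distribution.
rng = np.random.default_rng(1)
n_trials_per_class, n_channels = 50, 64
X = np.vstack([rng.normal(loc=i, size=(n_trials_per_class, n_channels))
               for i in range(len(COMMANDS))])
y = np.repeat(np.arange(len(COMMANDS)), n_trials_per_class)

# Train a simple decoder that maps neural features to command labels.
decoder = LogisticRegression(max_iter=1000).fit(X, y)

def decode(window: np.ndarray) -> str:
    """Translate a new window of neural features into an arm command."""
    return COMMANDS[int(decoder.predict(window.reshape(1, -1))[0])]

# A new window whose statistics resemble the "reach_left" training trials.
print(decode(rng.normal(loc=2, size=n_channels)))  # expected: reach_left
```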
At first, his control over the robotic arm was imprecise. To refine his skills, he practiced with a virtual version of the arm, which provided feedback on how well his imagined movements matched the intended actions. Over time, he improved to the point where he could successfully manipulate real-world objects with the robotic arm.
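The practice phase can be pictured roughly as a feedback loop: on each cued trial, the decoder's guess is compared with the intended action, and the running accuracy tells the user how well their imagined movements are being recognized. The stand-in decoder and numbers below are entirely invented for illustration.

```python
# Hypothetical sketch of a virtual-arm practice block: compare each decoded
# command against the cued target and report accuracy as feedback.
import random

COMMANDS = ["hand_open", "hand_close", "reach_left", "reach_right"]

def stand_in_decoder(intended: str, reliability: float = 0.7) -> str:
    """Pretend decoder: returns the intended command with some probability."""
    return intended if random.random() < reliability else random.choice(COMMANDS)

def practice_block(n_trials: int = 20) -> float:
    hits = 0
    for _ in range(n_trials):
        cue = random.choice(COMMANDS)        # virtual arm displays a target action
        decoded = stand_in_decoder(cue)      # decode the imagined movement
        hits += decoded == cue
        print(f"cue={cue:12s} decoded={decoded:12s} {'hit' if decoded == cue else 'miss'}")
    return hits / n_trials

print(f"block accuracy: {practice_block():.0%}")  # feedback used to guide further practice
```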
Once the participant transitioned to using the physical robotic arm, he was able to perform complex tasks, such as picking up blocks, moving objects to different locations, and even retrieving a cup from a cabinet and filling it with water. What sets this work apart from past efforts is that months later, he could still control the robotic arm after a brief 15-minute recalibration session.
At present, the team is working on refining the AI models to make the robotic arm’s movements faster and smoother. They also plan to test the BCI in home environments to assess its practical applications for everyday use.
For individuals with paralysis, this work could be life-changing. The ability to feed oneself or get a drink of water without assistance would provide a level of independence that was previously impossible. And now, it will be more than just a brief demonstration for a publication — the technology will finally be able to make a real impact in people’s everyday lives.