ATR Researchers Design Mind-Powered Robotic Third Arm


CabeAtwell

Mind-controlled prosthetics have been around for a while now, and most were designed to give those with missing limbs the ability to perform everyday tasks with the use of robotic appendages. Now imagine having an extra robotic arm that could perform separate tasks just by thinking about them while your human arms go about doing something else.

ATR’s Robotic Third Arm translates electrical impulses in the brain associated with specific activities to control the arm. (📷: ATR)

That’s the idea behind the robotic third arm from ATR (Advanced Telecommunications Research Institute International), which is controlled through a brain-machine interface (BMI). Researchers demonstrated that people can learn to use the appendage simply by thinking about it, or rather, by thinking about several tasks at once.

While the arm isn’t advanced enough for juggling or boosting your words per minute at the keyboard, it handles basic movements, such as drinking from a water bottle or grabbing objects and moving them from one place to another. That said, it’s an exciting look at the advancement of BMIs for controlling robotic limbs.

ATR designed its platform around algorithms that read the brain’s electrical activity through electrodes worn on the head. More precisely, the system reads the brain activity associated with performing different actions. For example, if you think about picking up a glass of water, neurons fire in a pattern that is unique to that task.
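
To make that idea concrete, here is a minimal sketch (not ATR’s actual pipeline) of how task-related brain activity might be turned into numbers a decoder can work with: per-channel band power computed from a single EEG epoch with NumPy. The sampling rate, channel count, and frequency bands are assumptions chosen for illustration.

```python
import numpy as np

# Illustrative only: compute per-channel band power from one EEG epoch.
# Motor imagery (e.g., "pick up the bottle") tends to change power in the
# mu (8-12 Hz) and beta (13-30 Hz) bands over sensorimotor areas, which is
# the kind of task-specific pattern a BMI decoder can pick up on.
FS = 250  # assumed sampling rate in Hz

def band_power(epoch, fs=FS, band=(8.0, 12.0)):
    """epoch: array of shape (n_channels, n_samples) -> mean power per channel in `band`."""
    freqs = np.fft.rfftfreq(epoch.shape[1], d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(epoch, axis=1)) ** 2
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return spectrum[:, mask].mean(axis=1)

# Example: a fake 8-channel, 2-second epoch of noise stands in for real EEG.
epoch = np.random.randn(8, 2 * FS)
features = np.concatenate([band_power(epoch, band=b) for b in [(8, 12), (13, 30)]])
print(features.shape)  # 16 features: mu and beta power for each of 8 channels
```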

The algorithms then separate those electrical patterns from others occurring at the same time and direct the robotic arm accordingly. The researchers tested the system on 15 volunteers, who had their brains monitored while multitasking, in this case sitting in a chair while balancing a ball on a board.
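
Purely as an illustration of that decoding step, the sketch below trains a simple classifier to separate two kinds of imagery and maps its output to an arm command. The scikit-learn decoder, the class labels, and the command names are assumptions for the example, not ATR’s actual interface.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Illustrative only: a toy decoder that separates "balance the ball" activity
# from "grasp the bottle" imagery, then maps its prediction to a hypothetical
# robotic-arm command.
rng = np.random.default_rng(0)
n_trials, n_features = 100, 16

# Stand-in features (e.g., band powers from the previous sketch); real data
# would come from the calibration session described in the article.
X = rng.normal(size=(n_trials, n_features))
y = rng.integers(0, 2, size=n_trials)  # 0 = balance task, 1 = grasp imagery

decoder = LinearDiscriminantAnalysis().fit(X, y)

def to_arm_command(features):
    """Map the decoded class to a (hypothetical) command for the third arm."""
    label = decoder.predict(features.reshape(1, -1))[0]
    return "GRASP_BOTTLE" if label == 1 else "HOLD_POSITION"

print(to_arm_command(rng.normal(size=n_features)))
```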

Electrical patterns generated while multitasking translate into commands for the arm, allowing the wearer to perform two tasks at the same time. (📷: ATR)

This was done without the robotic arm, so the system could learn each volunteer’s patterns. Once that calibration was complete, the volunteers performed a different task, this time visualizing the robotic arm grabbing a nearby water bottle. The participants were then instructed to imagine performing both tasks at the same time while hooked up to the arm, which was tasked with grabbing the bottle and succeeded 52% to 85% of the time (some people are better at multitasking than others).

The researchers see practical applications for their platform among those who have lost a limb or have limited mobility, but the applications for people without disabilities are not yet apparent. Still, it will be interesting to see how the technology evolves in the near future.
