Researchers at the University of Waterloo are working on a combination of deep learning artificial intelligence (AI) and wearable camera hardware that they hope will enable self-walking robotic exoskeletons capable of "thinking and moving" on their own.
"We're giving robotic exoskeletons vision so they can control themselves," explains Brokoslaw Laschowski, a PhD candidate in systems design engineering who leads the University of Waterloo research project dubbed ExoNet, which aims to do away with manual control of powered exoskeletons.
"[Manual control] can be inconvenient and cognitively demanding. Every time you want to perform a new locomotor activity, you have to stop, take out your smartphone and select the desired mode."
The solution: a wearable camera system that feeds data to a deep learning AI platform capable of recognizing environmental features like stairs and doors, and then, in the project's next phase, automatically working out how the motors should be controlled to navigate the environment by climbing stairs, avoiding obstacles, and so forth.
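To make the idea concrete, here is a minimal, hypothetical sketch of the final decision step such a pipeline might use: taking a classifier's per-frame confidence scores and switching the exoskeleton's control mode only when the prediction is confident enough. The class names, mode names, and threshold are illustrative assumptions, not ExoNet's actual labels or logic.

```python
# Hypothetical sketch: turning environment-classifier output into a control mode.
# Class labels, modes, and the confidence threshold are illustrative only.

ENV_CLASSES = ["level-ground", "incline-stairs", "decline-stairs", "door"]

MODE_FOR_CLASS = {
    "level-ground": "walk",
    "incline-stairs": "stair-ascent",
    "decline-stairs": "stair-descent",
    "door": "stop-and-open",
}

def select_mode(probabilities, current_mode, threshold=0.8):
    """Switch modes only when the classifier is confident,
    so a single noisy frame cannot trigger a spurious transition."""
    best = max(range(len(probabilities)), key=probabilities.__getitem__)
    if probabilities[best] >= threshold:
        return MODE_FOR_CLASS[ENV_CLASSES[best]]
    return current_mode  # stay in the current mode when uncertain

# A confident "incline-stairs" prediction switches the device to stair-ascent;
# an uncertain frame leaves the current mode unchanged.
mode = select_mode([0.05, 0.90, 0.03, 0.02], current_mode="walk")
```

The confidence gate illustrates why such systems hold their current mode rather than react to every frame: a wrong switch mid-step is costlier than a slightly delayed one.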
"Our control approach wouldn't necessarily require human thought," Laschowski claims. "Similar to autonomous cars that drive themselves, we're designing autonomous exoskeletons and prosthetic legs that walk for themselves."
The work is similar in concept to that published by researchers from North Carolina State University and the University of North Carolina at Chapel Hill last year, which focused on using a wearable camera connected to a Raspberry Pi single-board computer to improve the safety of lower-limb prostheses.
The latest paper in the ExoNet series, which deals with simulating the biomechanics of sit-to-stand locomotion with energy regeneration, has been published under closed-access terms as an early-access article in the journal IEEE Transactions on Medical Robotics and Bionics.