This Wearable Gesture Sensor Packs AI Smarts to Strip the Noise From the Signal, Even at Sea
This soft circuit on a cloth armband can pick up natural gesture signals for robot control — while running, driving, and even diving.
Researchers from the University of California San Diego have developed a wearable device that, they say, can let people control robots and other machines using simple gestures — even while driving a car, running, or piloting a boat on rough seas.
"This work establishes a new method for noise tolerance in wearable sensors," says co-first author Xiangjun Chen, a researcher in the UC San Diego Jacobs School of Engineering. "It paves the way for next-generation wearable systems that are not only stretchable and wireless, but also capable of learning from complex environments and individual users. This advancement brings us closer to intuitive and robust human-machine interfaces that can be deployed in daily life."
Gesture sensors aren't a new concept, of course, and there are plenty of existing systems for turning a flick of the wrist into control instructions for a robot. They all share the same weakness, though: they work well while the user is seated and stationary, but once the wearer is in motion, the "signal" of the gesture becomes hard to separate from the "noise" of unrelated movement.
That's where the team's work comes in. A soft electronic patch, powered by a flexible battery and attached to a cloth armband, tracks motion and muscle impulses and transmits the data over Bluetooth to a nearby system, where a custom deep-learning framework strips out the "noise" and isolates the gesture "signal." As a result, the device works in a variety of environments where traditional wearable gesture sensors would fail: while running, while driving, under high-frequency vibrations and other disturbances, and even during a simulated ocean ride.
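The paper's actual network architecture isn't described in this article, but the core idea of learning to separate a gesture "signal" from motion "noise" can be illustrated in miniature. The sketch below is purely hypothetical: it builds synthetic gesture pulses buried in low-frequency "body motion" drift, then trains a tiny one-hidden-layer NumPy network to map noisy windows back to clean ones. All names, signal shapes, and network sizes are assumptions for illustration, not the researchers' method.

```python
import numpy as np

rng = np.random.default_rng(0)
WIN = 64  # samples per sensor window (hypothetical)

def make_batch(n):
    """Synthetic 'gesture' pulses buried in motion-like noise (illustrative only)."""
    t = np.linspace(0.0, 1.0, WIN)
    center = rng.uniform(0.2, 0.8, size=(n, 1))
    clean = np.exp(-((t - center) ** 2) / 0.005)  # a gesture pulse
    # Low-frequency drift standing in for whole-body motion, plus sensor noise.
    drift = 0.8 * np.sin(2 * np.pi * rng.uniform(1, 3, (n, 1)) * t
                         + rng.uniform(0, 2 * np.pi, (n, 1)))
    noisy = clean + drift + 0.1 * rng.standard_normal((n, WIN))
    return noisy, clean

# A one-hidden-layer denoiser trained with plain gradient descent on MSE.
H = 128
W1 = 0.05 * rng.standard_normal((WIN, H)); b1 = np.zeros(H)
W2 = 0.05 * rng.standard_normal((H, WIN)); b2 = np.zeros(WIN)
lr = 0.01

for step in range(2000):
    x, y = make_batch(64)
    h = np.tanh(x @ W1 + b1)          # forward pass
    out = h @ W2 + b2
    err = out - y                      # gradient of 0.5 * MSE
    gW2 = h.T @ err / len(x); gb2 = err.mean(axis=0)
    gh = (err @ W2.T) * (1 - h ** 2)  # backprop through tanh
    gW1 = x.T @ gh / len(x); gb1 = gh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

# The trained network recovers the pulse far better than the raw signal.
x_test, y_test = make_batch(256)
denoised = np.tanh(x_test @ W1 + b1) @ W2 + b2
mse_noisy = float(np.mean((x_test - y_test) ** 2))
mse_denoised = float(np.mean((denoised - y_test) ** 2))
print(f"noisy MSE {mse_noisy:.3f} -> denoised MSE {mse_denoised:.3f}")
```

Even this toy network drives the reconstruction error well below that of the raw input, which is the same principle, at vastly smaller scale, behind training a deep model to pull gestures out of running, driving, or wave-induced motion.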
The team proposes the sensor as a way for patients undergoing rehabilitation, or individuals with limited mobility, to accurately control robotic aids using natural gestures that don't demand precise fine-motor control, and for industrial workers and first responders to control assistive robots hands-free in high-motion or otherwise hazardous environments. The researchers even propose a variant that could allow divers to remotely control underwater robots, even in rough seas.
"By integrating AI [Artificial Intelligence] to clean noisy sensor data in real time," Chen says, "the technology enables everyday gestures to reliably control machines even in highly dynamic environments."
The team's work has been published in the journal Nature Sensors under open-access terms.
Main article image courtesy of David Baillot/UC San Diego Jacobs School of Engineering.