TinyML Gives an Arduino Nano 33 BLE Sense Wristband High-Speed, Low-Power Gesture Recognition

Built using low-cost parts, this prototype wristband doesn't need a high-power radio and nearby computer to run a gesture-detecting CNN.

A pair of scientists from the German Research Center for Artificial Intelligence have unveiled a system for using on-device machine learning — tinyML — to add local gesture recognition to a wrist-worn microcontroller.

"Although hand gesture recognition has been widely explored with sensing modalities like IMU, electromyography and camera, it is still a challenge of those modalities to provide a compact, power-efficient on-board inferencing solution," the scientists, Sizhen Bian and Paul Lukowicz, explain. "In this work, we present a capacitive-sensing wristband surrounded by four single-end electrodes for on-board hand gesture recognition."

To prove the concept, the pair built a prototype wearable that connects a Texas Instruments FDC2214 capacitance-to-digital converter to an Arduino Nano 33 BLE Sense, which includes a 64MHz Arm Cortex-M4 processor with floating-point unit (FPU), 256KB of RAM, and 1MB of flash storage.
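The article doesn't reproduce the team's firmware, but a minimal Arduino sketch along these lines could poll the sensor's four channels over I2C. The register addresses below come from TI's FDC2214 datasheet; the I2C address, sampling rate, and omitted channel configuration are assumptions rather than the authors' actual setup.

```cpp
#include <Wire.h>

// Register addresses from the TI FDC2214 datasheet; the authors' actual
// wiring and configuration are not given in the article.
const uint8_t FDC2214_ADDR  = 0x2A;  // I2C address with the ADDR pin low
const uint8_t REG_DATA_CH0  = 0x00;  // MSB word of the channel 0 result
const uint8_t REG_DEVICE_ID = 0x7F;  // reads 0x3055 on an FDC2214

uint16_t readRegister16(uint8_t reg) {
  Wire.beginTransmission(FDC2214_ADDR);
  Wire.write(reg);
  Wire.endTransmission(false);  // repeated start, keep the bus claimed
  Wire.requestFrom(FDC2214_ADDR, (uint8_t)2);
  uint16_t value = (uint16_t)Wire.read() << 8;
  return value | Wire.read();
}

// Each 28-bit conversion spans two 16-bit registers: an MSB word whose
// top four bits are status flags, followed by an LSB word.
uint32_t readChannel(uint8_t channel) {
  uint8_t msbReg = REG_DATA_CH0 + channel * 2;
  uint32_t msb = readRegister16(msbReg) & 0x0FFF;
  uint32_t lsb = readRegister16(msbReg + 1);
  return (msb << 16) | lsb;
}

void setup() {
  Serial.begin(115200);
  Wire.begin();
  // NOTE: channel setup (dividers, drive current, waking the part from
  // sleep via the CONFIG register at 0x1A) is omitted; see the datasheet.
  Serial.print("FDC2214 ID: 0x");
  Serial.println(readRegister16(REG_DEVICE_ID), HEX);
}

void loop() {
  // Sample all four single-ended electrodes, one per FDC2214 channel.
  for (uint8_t ch = 0; ch < 4; ch++) {
    Serial.print(readChannel(ch));
    Serial.print(ch < 3 ? ',' : '\n');
  }
  delay(20);  // ~50Hz; the paper's actual sampling rate isn't stated here
}
```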

Despite these relatively modest specifications, the scientists were able to squeeze a convolutional neural network (CNN) onto the device — allowing it to process data entirely locally, rather than having to send it out to a more powerful system over Bluetooth. The result: Private, surprisingly high-performance gesture recognition at a minimal power draw.

"After converting and interpreting the model with TensorFlow Lite, we got an on-board model with a size of 29.6kbytes (float32 mode without quantization)," the pair explain. "The on-board inference time is around 12ms. The consumed power for the on-board gesture recognition is 26.4mW (8mA x 3.3V) without optimizing (currently we are giving full speed and full current to the sensing module, the power can be significantly optimized by supplying each channel with a threshold current while keeping the sensing sensitivity.)"

The prototype proved able to recognize seven independent gestures — down, up, left, right, stretch, fist, and idle — but the pair caution that work remains to be done: "The future work will be firstly collecting more data from multiple users to verify the generalization of the on-board sensing modality," they conclude. "Secondly, the power efficiency will be focused on to further decrease the working current to the level of µA. A few practical use cases (like smart home control, augmented reality interaction) will be carried out to verify the long-term reliability of the prototype in real environment."

The team's work has been published in the Adjunct Proceedings of the 2021 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2021 ACM International Symposium on Wearable Computers, and is available on the ACM Digital Library under closed-access terms.

Gareth Halfacree
Freelance journalist, technical author, hacker, tinkerer, erstwhile sysadmin. For hire: freelance@halfacree.co.uk.