A Smart Ring with a Smart Design

The VibRing smart ring turns quick, low-effort finger gestures into precise device inputs using a tiny piezo sensor and an Arduino.

Nick Bild
The VibRing hardware (📷: B. Li et al.)

Touchscreens and voice recognition systems may be the most popular user interface styles in portable electronics today, but they leave much to be desired in many applications. Touchscreens demand our visual attention and the use of our fingers, while voice recognition systems perform poorly and compromise privacy in crowded public spaces. Gesture-based interfaces hold a lot of promise for applications where these factors are a concern.

Yet gesture recognition systems still see little use in commercial devices. A major reason is that the primary technologies used to detect gestures, like cameras and inertial measurement units (IMUs), have power consumption, accuracy, or privacy issues that render them impractical for real-world use in many cases.

In an effort to make subtle single-handed gestures more practical as an input mechanism, a group of researchers at the University of British Columbia has developed what they call VibRing. It is a lightweight wireless hand gesture sensing platform built around a single finger-worn vibroacoustic sensor. This sensing method sidesteps many of the problems with existing technologies, as it uses little power, is highly accurate, and protects the user’s privacy.

The VibRing hardware is built around a small piezoelectric (piezo) sensor that detects the tiny vibrations generated by subtle finger gestures. When worn on the finger and coupled to the skin, this passive sensor picks up contact-based vibrations, and since it ignores ambient sound, it is highly robust in real-world environments. In the team’s experiments, the sensor was sampled using either an Arduino Nano 33 BLE or an Adafruit Feather development board with an ESP32 microcontroller.
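To give a rough sense of how such a setup might be read, the sketch below shows one minimal way to sample a piezo element on an analog input of an Arduino Nano 33 BLE at a fixed rate and stream the raw values over serial for later analysis. It is only an illustration under assumed details; the pin (PIEZO_PIN), the sample rate, and the ADC resolution are placeholders, not specifics from the VibRing firmware.

// Minimal illustration (not the VibRing firmware): sample a piezo element
// on an analog pin at a fixed rate and stream raw readings over serial.
// PIEZO_PIN, SAMPLE_RATE_HZ, and the 12-bit resolution are assumptions.

const int PIEZO_PIN = A0;                         // assumed analog input for the piezo
const unsigned long SAMPLE_RATE_HZ = 1000;        // assumed sampling rate
const unsigned long SAMPLE_PERIOD_US = 1000000UL / SAMPLE_RATE_HZ;

unsigned long lastSampleMicros = 0;

void setup() {
  Serial.begin(115200);
  analogReadResolution(12);                       // the Nano 33 BLE supports 12-bit ADC reads
}

void loop() {
  unsigned long now = micros();
  if (now - lastSampleMicros >= SAMPLE_PERIOD_US) {
    lastSampleMicros = now;
    int sample = analogRead(PIEZO_PIN);           // raw vibration reading
    Serial.println(sample);                       // stream to a host for logging or training
  }
}

In practice the raw stream would be windowed and labeled on a host machine before any classifier is trained, but the basic loop above captures the idea of a single passive sensor feeding a low-power microcontroller.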

To determine the best gesture set for recognition, the researchers developed a gesture design framework built around three finger regions (the nail, pad, and knuckle) and three action types (tap, flick, and rub). From the framework's 27 candidate gestures, 11 were chosen for the final implementation after user testing for comfort and distinctiveness.

In testing, the VibRing system achieved a gesture classification accuracy of 94.2% using a convolutional neural network. New users could reach 92.7% accuracy after collecting just 10 minutes of training data. In an office environment study simulating everyday conditions, the system maintained a true-positive rate of 90.9% while users mixed gestures in with normal tasks.
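The article does not detail the network itself, but for readers curious about the general shape of the approach, the toy C++ sketch below walks through how a small 1D convolutional model could map a fixed-length window of piezo samples to one of 11 gesture classes: a convolution filter with ReLU, global average pooling, and a dense output layer. The window length, filter values, and output weights here are placeholder assumptions, not the researchers' model.

// Toy illustration (not the researchers' model): the general shape of 1D-CNN-style
// inference over a fixed-length window of piezo samples, ending in 11 class scores.
#include <cstdio>
#include <vector>
#include <algorithm>

// One 1D convolution filter followed by ReLU ("valid" convolution).
std::vector<float> conv1d_relu(const std::vector<float>& x,
                               const std::vector<float>& kernel,
                               float bias) {
    size_t out_len = x.size() - kernel.size() + 1;
    std::vector<float> y(out_len, 0.0f);
    for (size_t i = 0; i < out_len; ++i) {
        float acc = bias;
        for (size_t k = 0; k < kernel.size(); ++k)
            acc += x[i + k] * kernel[k];
        y[i] = std::max(acc, 0.0f);               // ReLU
    }
    return y;
}

// Global average pooling: one scalar feature per filter.
float global_avg(const std::vector<float>& y) {
    float sum = 0.0f;
    for (float v : y) sum += v;
    return y.empty() ? 0.0f : sum / y.size();
}

int main() {
    const size_t kWindow = 256;                   // assumed window of piezo samples
    const size_t kClasses = 11;                   // the 11 gestures in the final set

    std::vector<float> window(kWindow, 0.0f);     // stand-in for a real signal

    // Two placeholder convolution filters (a trained model would learn many).
    std::vector<std::vector<float>> kernels = {
        {0.2f, 0.5f, 0.2f, -0.1f, -0.3f},
        {-0.4f, 0.1f, 0.6f, 0.1f, -0.4f},
    };

    // Extract one pooled feature per filter.
    std::vector<float> features;
    for (const auto& k : kernels)
        features.push_back(global_avg(conv1d_relu(window, k, 0.0f)));

    // Dense output layer: one score per gesture class (placeholder weights).
    std::vector<float> scores(kClasses, 0.0f);
    for (size_t c = 0; c < kClasses; ++c)
        for (size_t f = 0; f < features.size(); ++f)
            scores[c] += features[f] * 0.1f * float(c + 1);

    auto best = std::max_element(scores.begin(), scores.end());
    std::printf("Predicted gesture class: %ld\n",
                static_cast<long>(best - scores.begin()));
    return 0;
}

A real deployment would train the filter and output weights on labeled gesture windows; the sketch only shows how a short vibration window flows through convolution, pooling, and a final scoring layer.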

A series of experiments further validated the system’s design. The researchers compared the performance of the piezo sensor to an IMU in sensing short, vibration-rich signals by dropping a ping-pong ball onto an acrylic surface. The piezo sensor captured richer and more consistent signal data than the IMU, demonstrating its suitability for rapid gesture detection. These experiments also allowed them to fine-tune components such as the DC bias circuit and sensor mounting pressure to ensure optimal performance.

If the early results can be duplicated in larger trials, the VibRing platform might prove to be an effective way to bring quick, low-effort gestures to future wearable or mobile devices without compromising on comfort, accuracy, or power efficiency.

Nick Bild
R&D, creativity, and building the next big thing you never knew you wanted are my specialties.