Using a ‘Symbiotic’ Wearable to Experiment with Gesture Recognition

Embodied Companionship is a wearable research project that explores machine learning and human-data interaction.

hectoraisin
5 months ago Wearables

A symbiotic wearable sounds like the plot of a Marvel movie, but that is exactly what Ling Tan and Despina Papadopoulos aim to achieve with their Embodied Companionship project, which uses a hand-woven fabric with conductive yarns to experiment with gesture recognition and human-data interaction.

A prototype of The Little Creature

The fabric, nicknamed "The Little Creature," is connected to an Arduino Nano 33 BLE board running a machine learning model trained with TensorFlow Lite. The board is a classic platform for gesture recognition, with onboard sensors including the APDS9960 for gesture, proximity, light, and color detection. When the model recognizes a gesture, the board signals different parts of The Little Creature to heat up.

The fabric produces a cadence of heat, ranging from a faint murmur to a burning, almost aggressive sensation on the skin. The warmth mimics the sense of touch, making The Little Creature appear alive.
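The gesture-to-heat behavior described above can be sketched in a few lines. This is a hypothetical illustration, not code from the project: the gesture labels, zone layout, and PWM-style duty cycle are all assumptions made for the example.

```python
# Hypothetical sketch of mapping a recognized gesture to a heating zone.
# Gesture names and zone numbers are illustrative, not from the project.

GESTURE_TO_ZONE = {
    "wave": 0,
    "stroke": 1,
    "tap": 2,
}

def heat_command(gesture: str, intensity: float) -> dict:
    """Map a recognized gesture to a heating zone and an 8-bit duty cycle.

    intensity in [0, 1] scales the output from a faint warmth
    ("murmur") up to a strong, almost aggressive heat.
    """
    if gesture not in GESTURE_TO_ZONE:
        return {"zone": None, "duty": 0}  # unknown gesture: stay idle
    duty = int(255 * max(0.0, min(1.0, intensity)))
    return {"zone": GESTURE_TO_ZONE[gesture], "duty": duty}

print(heat_command("stroke", 0.3))  # gentle warmth on zone 1
```

On the actual hardware, the duty value would drive a transistor or MOSFET that powers a section of conductive yarn; the dictionary return here just stands in for that output stage.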

Stylefree scarf by Ling Tan and Despina Papadopoulos

Embodied Companionship builds on an earlier wearable companion, Stylefree, a scarf that uses breathing and heating patterns to become more alive as the wearer interacts with it. With Embodied Companionship, Tan says they wanted to explore the cybernetic relationship between “technology, the wearable (medium), and its wearer.” Using second-order feedback loops and machine learning, the project aims to embody “machine learning processes and shed some light on the algorithmic black box.”

Tan has also worked on SUPERGESTURES, a participatory performance in Manchester that used gesture-sensing wearables, and she identifies some challenges of working on wearable projects.

For one, gesture recognition is complex, much more so than voice recognition. Simple gestures like swipes are easy to identify, and we use them every day. However, subtle, slow gestures are harder to recognize because they produce low-intensity signals while the sensors "listen" for significant motion.
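A toy example shows why slow gestures slip past motion sensing. Assuming a naive peak-magnitude trigger (a common first stage before any classifier, though not necessarily what this project uses), a sharp swipe crosses the threshold while a gentle stroke never does:

```python
def detect_motion(accel_samples, threshold=0.5):
    """Naive trigger: fire when peak acceleration magnitude exceeds threshold.

    A quick swipe produces a sharp spike; a slow, subtle gesture
    stays below the threshold and is never "heard" at all.
    """
    return max(abs(a) for a in accel_samples) > threshold

fast_swipe = [0.0, 0.9, 1.4, 0.8, 0.1]     # sharp spike (illustrative values)
slow_stroke = [0.05, 0.1, 0.12, 0.1, 0.06] # gentle, sustained motion

print(detect_motion(fast_swipe))   # True  — easily detected
print(detect_motion(slow_stroke))  # False — missed entirely
```

Lowering the threshold helps, but only up to the sensor's noise floor, which is why subtle gestures remain genuinely hard.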

To classify an input, a gesture recognition system matches it against a set of preset categories, and it usually returns one of its learned gestures regardless of what it was shown. However, the absence of a gesture can itself be a gesture, depending on the context. This "negative space" is easily understood by people but presents a challenge for gesture recognition algorithms.
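The forced-choice problem can be made concrete. Assuming a generic closed-set classifier (this is a sketch of the general technique, not the project's model), the highest score always wins, even when the scores are near-uniform noise; adding a confidence floor is the simplest way to carve out an explicit "nothing" outcome:

```python
def classify(scores, labels, reject_below=None):
    """Pick the highest-scoring gesture label.

    With reject_below=None this mimics a closed-set classifier:
    it always returns one of its learned gestures, even for noise.
    A confidence floor creates an explicit "nothing" outcome instead.
    """
    best = max(range(len(scores)), key=scores.__getitem__)
    if reject_below is not None and scores[best] < reject_below:
        return "nothing"
    return labels[best]

labels = ["wave", "stroke", "tap"]
noise = [0.36, 0.33, 0.31]  # near-uniform scores: the model is guessing

print(classify(noise, labels))                    # 'wave' — forced choice
print(classify(noise, labels, reject_below=0.6))  # 'nothing'
```

A fixed threshold like this is exactly the kind of blanket filtering Tan argues is insufficient on its own, since what counts as "nothing" depends on the context of each piece.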

SUPERGESTURES (📷: Robin Hill)

Tan says the concept of recognizing "nothing" gestures has not been widely explored for wearable or body-based work, the way it has in other machine learning fields such as computer vision. She argues that error detection and filtering alone will not suffice; "nothing" gestures have to be considered on a case-by-case basis.

There are also other nuances to consider. While some are universal, gestures tend to vary across cultures. A gesture can be cordial in some cultures and offensive in others. How can we train machines to understand the context and nuances of gestures in a tech landscape where computers cannot identify skin colors reliably?

Embodied Companionship is still a work in progress, but it showcases the potential of gesture recognition for human-computer interaction and quantified self projects. While not a true symbiote yet, The Little Creature points toward a future where humans and technology interact in a dynamic, mutually beneficial way.

The Embodied Companionship page has more information about the project.

hectoraisin

Freelance writer specializing in hardware product reviews, comparisons, and explainers
