Bringing Interactivity to Soft Devices with E-Textile Microinteractions

Not your grandma's sewing kit! This project out of Google Research puts thread to work in an e-textile helical sensing matrix.

Taylor Tabb
5 years ago

At first glance, a thread is a pretty unassuming form, but just like logic gates in hardware systems, threads are the fundamental building blocks of textile systems. In the last few years, a number of research projects have expanded the capabilities of textiles, sometimes in more traditional ways, like odor removal or temperature responsiveness, and other times in more radical and experimental ways. One of these more experimental efforts came in the form of a 2018 UIST paper from Google Interaction Lab, I/O Braid: Scalable Touch-Sensitive Lighted Cords Using Spiraling, Repeating Sensing Textiles and Fiber Optics, which proposed an interactive textile cord that could sense proximity, touch, and twist, and could give visual feedback through fiber optic strands. While the sensing architecture of I/O Braid is extremely impressive, the primary contribution of the initial paper was the method, less so the applications or design space. Fast-forward to 2020, and Google researchers have dived deeper into I/O Braid, showing us what types of interactions are possible and what a future of ubiquitous e-textile-driven computing might look like.

The new paper, E-textile Microinteractions: Augmenting Twist with Flick, Slide and Grasp Gestures for Soft Electronics, overviews six different types of controls that the cord-based interface allows: twisting, flicking, sliding, pinching, grabbing, and patting. These fall into two gesture types: continuous (twisting) and discrete (flicking, sliding, pinching, grabbing, and patting). The most exciting parts of the paper are the applications the team has developed by combining the continuous and discrete gestures, and the machine learning-driven pipeline that interprets each action in real time.

The foundation of it all is the Helical Sensing Matrix (HSM): an array of different types of fibers (some conductive, some fiber optic, and some purely passive structural elements) braided into the cord. Pairs of conductive fibers that sit along the same diameter line (180° apart) share an electrode; this arrangement allows the system to identify the relative motion of pinching and rolling across the pairs. Combined with monitoring the phase changes in signals sent through the fibers, this lets all of the gesture types be detected. Then, optionally, the fiber optic strands can provide feedback through colored light.
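The paper itself doesn't include code, but the pinch-and-roll idea is easy to sketch. Below is a minimal, hypothetical Python example of how rolling direction could be inferred from electrode readings sampled around the cord's circumference: as a finger rolls, the activation peak sweeps from one electrode to its neighbor, so the lag between adjacent channels hints at the direction of motion. The electrode count, sampling, and thresholds here are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def estimate_roll_direction(signals: np.ndarray) -> int:
    """Infer roll direction from capacitive readings.

    signals: (n_electrodes, n_samples) array, channels ordered around the
    cord's circumference. Returns +1 or -1 for the two roll directions,
    or 0 if no consistent sweep is detected. (Illustrative sketch only.)
    """
    n_samples = signals.shape[1]
    lags = []
    for i in range(signals.shape[0] - 1):
        a = signals[i] - signals[i].mean()
        b = signals[i + 1] - signals[i + 1].mean()
        xcorr = np.correlate(a, b, mode="full")
        # Index (n_samples - 1) is zero lag; a negative lag means channel
        # i's peak precedes channel i + 1's, i.e. the touch swept i -> i+1.
        lags.append(int(xcorr.argmax()) - (n_samples - 1))
    mean_lag = float(np.mean(lags))
    if abs(mean_lag) < 1:
        return 0  # peaks essentially simultaneous: a pinch, not a roll
    return -1 if mean_lag < 0 else 1  # sign convention is arbitrary here

# Example: a synthetic sweep across four electrodes
t = np.arange(200)
sweep = np.stack([np.exp(-0.01 * (t - 50 - 20 * k) ** 2) for k in range(4)])
print(estimate_roll_direction(sweep))  # -> -1 (peak hits electrode 0 first)
```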

To study and categorize the gestures, the team gathered 864 samples from 12 participants, then extracted feature vectors that describe each gesture numerically. They used a Python machine learning toolkit for classification, which resulted in the spectacular GIF below depicting the feature vectors for each participant and gesture.
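The write-up doesn't name the exact model, only that a Python machine learning toolkit was used, so the snippet below is just a plausible reconstruction with scikit-learn: random placeholder arrays stand in for the 864 recorded feature vectors, and an RBF-kernel SVM stands in for whatever classifier the team actually trained.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Placeholder data standing in for the study's 864 gesture recordings:
# each row is a feature vector extracted from the electrode time series.
# The 64-dimensional features and 6 gesture classes are illustrative.
X = rng.random((864, 64))
y = rng.integers(0, 6, 864)

# Standardize features, then fit an SVM (one plausible classifier choice).
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
print("5-fold CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```

With real feature vectors in place of the random arrays, cross-validation like this is the standard way to check that gestures separate cleanly across participants.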

Finally, the team built prototype scenarios for the gesture-cord combos, each enabling control of an independent hardware device. The first is a pair of e-textile USB-C headphones that allow playback control by tapping (pause/play), double-tapping (next track), and rolling (volume up/down). The second is an interactive speaker cable for a smart device, offering similar gesture control of music.
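As a thought experiment, here is a hedged sketch of the glue logic such a demo needs: mapping classifier output to media commands, including collapsing two quick taps into a "next track" event. The gesture names, the 0.4 s double-tap window, and the fire-play/pause-immediately policy are all illustrative choices, not details from the paper.

```python
from dataclasses import dataclass
from typing import Callable

DOUBLE_TAP_WINDOW = 0.4  # seconds between taps to count as a double tap (assumed)

@dataclass
class PlaybackController:
    """Maps recognized cord gestures to media commands (illustrative names)."""
    on_command: Callable[[str], None] = print
    _last_tap: float = float("-inf")

    def handle(self, gesture: str, timestamp: float) -> None:
        if gesture == "tap":
            if timestamp - self._last_tap <= DOUBLE_TAP_WINDOW:
                # Second tap arrived in time: upgrade to "next track".
                self.on_command("next_track")
                self._last_tap = float("-inf")
            else:
                # Fire play/pause immediately; a quick second tap will
                # follow up with next_track (a simple, optimistic policy).
                self.on_command("play_pause")
                self._last_tap = timestamp
        elif gesture == "roll_cw":
            self.on_command("volume_up")
        elif gesture == "roll_ccw":
            self.on_command("volume_down")

controller = PlaybackController()
controller.handle("tap", 0.00)      # -> play_pause
controller.handle("tap", 0.25)      # -> next_track
controller.handle("roll_cw", 1.0)   # -> volume_up
```

The optimistic policy (firing play/pause on the first tap rather than waiting out the window) trades an occasional spurious pause for lower perceived latency; a real device could instead debounce by delaying the first command.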

The team also lays out future directions, primarily the hope that their work motivates similar gestural microinteractions in future smart fabrics and wearables. And it certainly does inspire thought about what other physical objects could benefit from e-textile technology.

The paper appears in the Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, and a write-up from the team is available on the Google AI Blog.
