When we use our mobile devices and computers, most of us rely almost entirely on visual communication: texts, emails, and tweets all appear as intangible messages on the screens around us. That does little good, though, if you're in a situation where you can't see well, or if you're visually impaired. Text-to-voice can help, but it fails in noisy environments. A new haptic technology developed by researchers at Rice University's Mechatronics and Haptic Interfaces Laboratory can communicate entirely through touch.
The team, led by Marcia O’Malley, saw the need for a method of communication that would still work in emergency situations when it’s difficult to see and hear. The technology also has many other potential applications. It could, for instance, allow the blind to read electronic messages. While braille is useful for reading static text, it hasn’t translated well to dynamic electronic messages. This wearable technology could solve both problems by giving users the ability to feel words.
The device itself, called MISSIVE (Multi-sensory Interface of Stretch, Squeeze, and Integrated Vibrotactile Elements), is an armband with multiple built-in actuators that press against the skin. It can squeeze, stretch, and vibrate against the wearer's arm to communicate messages. During testing, the researchers showed that users could learn to interpret most of the words presented by MISSIVE in just two hours. Phonemes, the vocal sounds that make up words, make that possible. Using just 23 of the most common English phonemes encoded as haptic feedback, users can decipher complex messages without ever looking at a screen.
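The core idea, mapping each phoneme to a distinct combination of actuator cues and playing them in sequence, can be sketched in a few lines of Python. The cue names, the three-phoneme sample, and the mapping below are illustrative assumptions for the sketch; the article does not specify MISSIVE's actual cue vocabulary or code.

```python
# Illustrative sketch only: the cue types, locations, and
# phoneme-to-cue mapping are hypothetical, not MISSIVE's real scheme.

# Each phoneme maps to one haptic cue: (cue_type, actuator_location).
HAPTIC_CODE = {
    "K":  ("vibrate", "upper arm"),
    "AE": ("squeeze", "band"),
    "T":  ("vibrate", "lower arm"),
}

def encode_word(phonemes):
    """Translate a phoneme sequence into an ordered list of haptic cues,
    which the armband would then play back one after another."""
    return [HAPTIC_CODE[p] for p in phonemes]

# "cat" broken into phonemes: K, AE, T
cues = encode_word(["K", "AE", "T"])
print(cues)
# → [('vibrate', 'upper arm'), ('squeeze', 'band'), ('vibrate', 'lower arm')]
```

Because the mapping operates on phonemes rather than letters, a full 23-entry table of this kind could cover most spoken English words, which is what keeps the system learnable in a short training session.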