Sound2Haptic Brings Advanced Touch Feedback to Everyday Devices
MIT's Sound2Haptic packs multi-channel haptic hardware into one compact device, making immersive VR experiences cheaper and more portable.
Virtual and augmented reality experiences are not going to be fully immersive until we can perfectly reproduce the sensation of touch. It would seem that we are still a long way from that ultimate goal, but we have to make do as best we can until we can plug into the Matrix. For reasons of cost and practicality, most of today’s haptic interfaces use some type of vibrotactile system. These interfaces have limited resolution, but for certain applications they work well enough.
The main components of these systems are simple vibration motors, which keeps their cost and complexity low, but that is only part of the story. More complex (and better-performing) vibrotactile interfaces use many motors, and those motors require a lot of supporting hardware to operate, such as sound cards and haptic amplifiers. This hardware provides components like the multichannel digital-to-analog converters needed to create complex haptic feedback, but it also adds enough bulk and cost that the systems become impractical for most use cases.
There may be a better solution on the horizon, however. A group led by researchers at MIT has developed a system called Sound2Haptic that makes it easy to integrate vibration motors and all of the necessary supporting hardware for a multi-channel haptic interface into a single, compact device.
Sound2Haptic is built around a custom circuit board that uses a CMedia CM6206 USB-to-7.1 audio chip (the same kind found in many consumer-grade sound cards) to convert standard digital audio into eight synchronized output channels. Each of those channels passes through a TPA6211A1 amplifier stage that provides headphone-level power suitable for driving a small vibration motor. This approach allows the device to act just like a regular sound card, but one that outputs tactile sensations instead of sound.
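Because the board enumerates as an ordinary multi-channel sound card, driving it from a host computer should look like playing audio. The sketch below illustrates the idea using Python and the sounddevice library, pulsing each of eight channels in turn; the channel count, drive frequency, and device naming are assumptions for illustration, not details taken from the Sound2Haptic software.

```python
# Minimal sketch: treating an 8-channel USB sound card as a haptic output.
# Assumes the board shows up as a standard multi-channel audio device and
# that `sounddevice` and `numpy` are installed. All values are illustrative.
import numpy as np
import sounddevice as sd

SAMPLE_RATE = 48_000     # standard audio sample rate
NUM_CHANNELS = 8         # one channel per vibration motor
DRIVE_FREQ_HZ = 200.0    # a frequency small vibrotactile motors respond to
BURST_SEC = 0.5          # length of each vibration burst

# One sine burst that will be routed to each motor in sequence.
t = np.arange(int(SAMPLE_RATE * BURST_SEC)) / SAMPLE_RATE
burst = (0.8 * np.sin(2 * np.pi * DRIVE_FREQ_HZ * t)).astype(np.float32)

# Build a (frames, channels) buffer: channel 0 fires first, then channel 1, etc.
frames = np.zeros((len(t) * NUM_CHANNELS, NUM_CHANNELS), dtype=np.float32)
for ch in range(NUM_CHANNELS):
    start = ch * len(t)
    frames[start:start + len(t), ch] = burst

# Play the buffer exactly as if it were 7.1 audio; the "sound" comes out as vibration.
sd.play(frames, samplerate=SAMPLE_RATE)  # optionally pass device="<USB sound card name>"
sd.wait()
```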
To make setup and calibration easier, the team added a current-sensing system based on the INA180 amplifier. This circuit measures the electrical load of each actuator to determine how tightly it is pressed against the skin, which is a key factor in how vibrations are perceived. Compared with more complicated back-electromotive-force sensing schemes, this current-based approach avoids the need for switching circuitry, keeping the design lightweight and reliable.
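To make the idea concrete, the sketch below shows how a current-sense reading might be turned into a contact estimate: the INA180 outputs a voltage proportional to the current through a shunt resistor, and a motor pressed against skin draws a different current than one vibrating freely. The shunt value, gain variant, ADC parameters, and thresholds here are illustrative assumptions, not values from the paper.

```python
# Minimal sketch: estimating actuator contact from an INA180-style current reading.
# V_out = I * R_shunt * G, so I = V_out / (G * R_shunt). All constants are illustrative.
ADC_BITS = 12
V_REF = 3.3        # ADC reference voltage (V)
R_SHUNT = 0.1      # shunt resistance in series with the motor (ohms)
GAIN = 50.0        # e.g. the INA180A2 variant has a fixed 50 V/V gain

def adc_to_current(adc_counts: int) -> float:
    """Convert a raw ADC reading of the amplifier output to motor current (A)."""
    v_out = adc_counts / (2 ** ADC_BITS - 1) * V_REF
    return v_out / (GAIN * R_SHUNT)

def contact_estimate(current_a: float, free_running_a: float) -> str:
    """Rough classification: deviation from the motor's free-running current
    suggests how firmly it is pressed against the skin (thresholds are made up)."""
    delta = abs(current_a - free_running_a)
    if delta < 0.005:
        return "loose / no contact"
    if delta < 0.020:
        return "light contact"
    return "firm contact"

# Example: a 12-bit reading of 1850 counts, against a 0.28 A free-running baseline.
i = adc_to_current(1850)
print(f"{i * 1000:.1f} mA -> {contact_estimate(i, free_running_a=0.28)}")
```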
The device’s actuators are mounted in modular motor rings that can be rearranged and customized for different body locations. Each motor sits inside an inner ring suspended within an outer frame by fine thread, allowing horizontal movement while minimizing unwanted vibration spread. This design makes the device both quiet and comfortable, and it prevents one actuator’s motion from interfering with its neighbors.
To validate their design, the team conducted a psychophysical study with participants aged 20 to 97, testing perception thresholds and spatial localization across three different prototype devices. Results showed that their necklace and over-ear designs provided superior spatial localization, while bracelet and necklace forms offered better vibration sensitivity.
By lowering both the cost and complexity of advanced haptic systems, Sound2Haptic could accelerate progress in tactile computing. It might not yet deliver the full sensation of a virtual handshake, but it is a meaningful step toward making immersive touch experiences accessible to everyone.