I Know Just How You Feel
This robotic haptic display replicates complex tactile sensations with separate mechanisms to produce cutaneous and kinesthetic feedback.
Haptic displays, also known as tactile displays or touch feedback devices, are technologies designed to convey information through the sense of touch. Unlike visual or auditory displays, haptic displays engage touch to communicate information or enhance the overall user experience. These displays can simulate textures, forces, vibrations, or even shapes, allowing users to interact with digital or virtual environments in a more immersive and realistic manner.
One of the key applications of haptic displays is in virtual reality (VR) and augmented reality (AR) systems. By incorporating haptic feedback into VR/AR environments, users can experience a greater sense of presence and engagement. For example, in VR gaming, haptic displays can simulate the sensation of touching objects within the virtual world, making the experience more lifelike and interactive. In medical training simulations, haptic feedback can replicate the sensation of performing surgical procedures, providing valuable tactile cues to trainees.
Existing haptic technologies range from simple vibration motors found in smartphones and game controllers to more sophisticated systems incorporating actuators, sensors, and advanced algorithms. Some haptic displays use piezoelectric materials or electromagnetic actuators to generate tactile sensations, while others utilize pneumatic or hydraulic systems to simulate forces and textures. Despite advancements in haptic technology, current systems still face challenges in accurately replicating complex tactile sensations, such as softness perception.
One significant limitation is the inability to separately render cutaneous cues (surface texture) and kinesthetic cues (forces and pressures), both of which contribute to softness perception. This gap may soon be filled by a robotic haptic interface designed by researchers at the Swiss Federal Institute of Technology Lausanne and Hanyang University. Called the Softness Rendering Interface (SORI), the device has separate elements to simulate cutaneous and kinesthetic cues, enabling it to render complex tactile sensations.
Building on previous studies, the team knew that the perception of softness depends heavily on how much skin comes into contact with the surface of an object. Accordingly, the component of SORI that the fingertip touches directly is a soft, flexible silicone membrane. The membrane can be made to feel soft, hard, or anywhere in between by inflating it with varying amounts of air, which determines how much of the fingertip makes contact with it.
That innovation allows complex cutaneous sensations to be conveyed, but a second system was needed to simultaneously render kinesthetic cues. This consists of motor-actuated origami joints. By applying more or less force through these joints, the device varies the resistance a user feels when pressing down, producing a sensation of stiffness or softness.
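To make the two-channel idea concrete, here is a minimal sketch of how a controller might map a single target softness value onto the two actuation channels described above. The function name, pressure and stiffness ranges, and the linear mappings are illustrative assumptions, not the actual SORI control scheme.

```python
def render_softness(target_softness: float) -> dict:
    """Map a target softness (0.0 = rigid, 1.0 = very soft) onto the
    two actuation channels: membrane inflation and joint stiffness.
    All numeric ranges here are assumed for illustration."""
    if not 0.0 <= target_softness <= 1.0:
        raise ValueError("target_softness must be in [0, 1]")

    # Cutaneous channel: a softer target means more membrane inflation,
    # so more of the fingertip wraps around the surface on contact.
    membrane_pressure_kpa = 5.0 + 45.0 * target_softness

    # Kinesthetic channel: a softer target means the motor-driven
    # origami joints push back with less force per unit displacement.
    joint_stiffness_n_per_mm = 8.0 * (1.0 - target_softness) + 0.5

    return {
        "membrane_pressure_kpa": membrane_pressure_kpa,
        "joint_stiffness_n_per_mm": joint_stiffness_n_per_mm,
    }

# A marshmallow-like surface: high inflation, low joint stiffness.
print(render_softness(0.9))
```

The key design point is that the two channels are driven independently, which is what lets the device decouple how an object's surface feels from how it yields under pressure.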
Mustafa Mete, the lead researcher involved in this work, explained the need for both types of feedback: "If you press on a marshmallow with your fingertip, it's easy to tell that it's soft. But if you place a hard biscuit on top of that marshmallow and press again, you can still tell that the soft marshmallow is underneath, even though your fingertip is touching a hard surface."
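The marshmallow-and-biscuit effect can be sketched with a simple springs-in-series model: two elastic layers pressed in sequence behave like one spring whose stiffness is dominated by the softer layer. The stiffness values below are made-up illustrative numbers, not measurements.

```python
def series_stiffness(k_top: float, k_bottom: float) -> float:
    """Effective stiffness (force per unit displacement) of two
    elastic layers compressed in series: 1/k = 1/k_top + 1/k_bottom."""
    return (k_top * k_bottom) / (k_top + k_bottom)

biscuit = 1000.0   # N/mm, hard top layer (assumed value)
marshmallow = 1.0  # N/mm, soft bottom layer (assumed value)

# The stack is barely stiffer than the marshmallow alone, which is why
# the finger still perceives the softness underneath the hard surface.
print(series_stiffness(biscuit, marshmallow))
```

In this model the combined stiffness works out to roughly 0.999 N/mm, almost exactly that of the marshmallow, even though the fingertip only ever touches the biscuit.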
It is this type of complex sensation that SORI was designed to reproduce. By offering a new level of realism in the reproduction of tactile sensations, this system could enable many new, and more immersive, digital experiences in the future.