A pair of roboticists from the Japan Advanced Institute of Science and Technology (JAIST) have published a paper detailing a vision-based artificial skin for robots, which they claim offers a human-like sense of touch: TacLINK.
"The main challenge lies in mimicking the inherent complexity of natural skin structure that has a particularly high density of mechanoreceptors with specialized functions such as sensing pressure, vibrations, temperature, and pain," explains Associate Professor Van Anh Ho of the problem he and colleague Lac Van Duong sought to address. "All approaches so far have only focused on developing a skin-like structure with a matrix of different sensors without considering the bulk of wires, electronic components, and the risk of damage from frequent contact."
The solution: a high-performance vision-based sensing system with a considerably simpler structure, improving scalability and reliability while reducing manufacturing costs. Dubbed TacLINK, the novel skin proved capable of processing tactile information and determining both contact force and contact geometry, much like the natural sense of touch.
The prototype TacLINK uses an acrylic tube as a "bone" framework, covered in 500 cm² of silicone rubber "skin". An array of markers is printed on the surface of the skin and used to track deformation via a pair of coaxial cameras configured as a stereo pair, an improvement in cost and robustness over embedded sensors, the pair claim.
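The core idea, tracking printed markers with a stereo camera pair and inferring force from how far the skin deforms, can be sketched in a few lines. The sketch below is purely illustrative and is not the authors' implementation: the focal length, baseline, and linear-elastic stiffness model are all assumed values chosen for the example, and a real system would use calibrated camera models and a learned or finite-element force mapping.

```python
import numpy as np

# Hypothetical parameters for illustration (not taken from the paper):
FOCAL_PX = 700.0    # focal length in pixels
BASELINE = 0.02     # stereo baseline in metres
STIFFNESS = 1500.0  # assumed linear skin stiffness, N/m

def triangulate(uv_left, uv_right):
    """Recover 3-D marker positions from a rectified stereo pair.

    uv_left, uv_right: (N, 2) pixel coordinates of the same markers
    seen by the left and right cameras. Uses the standard disparity
    relation Z = f * B / d; returns (N, 3) points in metres.
    """
    uv_left = np.asarray(uv_left, dtype=float)
    uv_right = np.asarray(uv_right, dtype=float)
    disparity = uv_left[:, 0] - uv_right[:, 0]
    z = FOCAL_PX * BASELINE / disparity
    x = uv_left[:, 0] * z / FOCAL_PX
    y = uv_left[:, 1] * z / FOCAL_PX
    return np.stack([x, y, z], axis=1)

def contact_forces(rest_pts, deformed_pts):
    """Estimate a per-marker force magnitude from marker displacement,
    assuming a simple linear-elastic skin model (an assumption made
    here only to keep the example short)."""
    displacement = np.linalg.norm(deformed_pts - rest_pts, axis=1)
    return STIFFNESS * displacement

# Example: marker 0 is pressed inward (its right-image position
# shifts, changing the disparity); marker 1 is untouched.
rest = triangulate([[350, 200], [400, 220]], [[280, 200], [330, 220]])
pressed = triangulate([[350, 200], [400, 220]], [[275, 200], [330, 220]])
forces = contact_forces(rest, pressed)
```

The triangulated marker cloud gives the contact geometry directly, while the displacement field feeds the force estimate, which is how a single camera pair can stand in for a dense grid of discrete pressure sensors.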
"The artificial skin used in our study can be easily fabricated by the casting method and can, therefore, be implemented on other parts of robots, such as fingers, legs, chests, and heads, and even for smart prosthetics for humans, allowing a disabled person to perceive sensations the same way as a normal human," Ho claims. "In addition, it can also be used to design various sensory devices in medicine, healthcare, and industry. In fact, it is especially suited for the development of robotic systems in the post-COVID era to enable remote service with robotic avatars."
The work has been published in the journal IEEE Transactions on Robotics under open-access terms.