Not Just for Looks, This "RobotSweater" Is an Easy Way to Give Robots a Touch-Sensitive Skin

Made using a standard knitting machine, RobotSweaters can be used for touch-based sensing and even wearable control systems.

Gareth Halfacree
11 months ago • Robotics / Sensors / Wearables

Researchers from Carnegie Mellon University and the University of California at Santa Barbara have come up with an easy way to give robots tactile "skin" — by knitting them "RobotSweaters" on industrial knitting machines.

"Knitting machines can pattern yarn into shapes that are non-flat, that can be curved or lumpy," explains James McCann, assistant professor at Carnegie Mellon's School of Computer Science and co-author of the paper detailing the RobotSweater concept. "That made us think maybe we could make sensors that fit over curved or lumpy robots."

Designed for more than just looking good, RobotSweaters give cosy robots a whole new sense of touch. (📹: Carnegie Mellon)

"We can use [the RobotSweater] to make the robot smarter during its interaction with humans," explains Changliu Liu, assistant professor of robotics at Carnegie Mellon's SCS. "With RobotSweater, the robot's whole body can be covered, so it can detect any possible collisions."

The RobotSweaters are surprisingly simple to manufacture, being based on just two layers of striped conductive yarn sandwiching a lace-like layer of insulation. When pressure is applied, the circuit between the upper and lower layers is completed — triggering an input in the robot.

"The force pushes together the rows and columns to close the connection," explains Wenzhen Yuan, assistant professor, director of Carnegie Mellon's RoboTouch lab, and corresponding author on the paper. "If there's a force through the conductive stripes, the layers would contact each other through the holes."

Connected to an Arduino Nano microcontroller via snap fasteners, the RobotSweater can detect not only when it's being touched but also the distribution, shape, and force of any contact, with greater precision than rival vision-based approaches.

To prove the concept, the team equipped a pair of robots with the pressure-sensitive garments to provide touch-based control β€” moving a robot's head based on touch, and allowing a robot arm to be pushed out of the way or its gripper opened and closed based on the operator's grip.

The team's work is available on Cornell University's arXiv preprint archive under open-access terms.

Gareth Halfacree
Freelance journalist, technical author, hacker, tinkerer, erstwhile sysadmin. For hire: freelance@halfacree.co.uk.