Lazy People, Rejoice: A Laundry-Folding Robot Was Created

Magnetic particles embedded in an artificial skin give an ML model the data it needs to guide a robotic arm in accurately grasping cloth.

Nick Bild
2 years ago • Machine Learning & AI
Robotic arm equipped with ReSkin system picking up fabric (📷: S. Tirumala et al.)

Sometimes it seems like computer vision gets all the attention when it comes to input data for machine learning algorithms. Image sensors are the sensors that all the other sensors want to be. And understandably so: visual information is very rich and has been leveraged to great effect in fields ranging from autonomous driving and medical diagnostics to assistive technologies. These successes can make it easy to forget how important our other senses are as we go about everyday activities. Consider driving, for example. Certainly it is important to see where you are going, but the feel of the road through the steering wheel also tells you about its condition when it is raining or snowing. Similarly, getting dressed or folding laundry would be very difficult without the feel of the fabric in one's hands.

On that last point, several methods have recently been proposed to help robots manipulate cloth in various ways. However, these approaches have not been particularly effective at many common tasks, like grabbing a desired number of layers of cloth from a stack. This is a deceptively difficult problem, owing to the flexibility of cloth and the unpredictable ways it can crumple when handled. To address these shortcomings, a team of researchers at The Robotics Institute at Carnegie Mellon University has developed a system called ReSkin that uses magnetometer-based sensors to give robots an accurate sense of touch, even when working with deformable materials like cloth.

The ReSkin artificial skin is a thin elastic polymer with magnetic particles embedded in it, which allows measurements to be captured along three axes. When this simulated skin moves or is depressed, the resulting changes in the surrounding magnetic fields can be interpreted as a tactile signal. Because ReSkin sensors are very thin and do not rely on imaging, they can fit into very small, dark spaces, like the folds between layers of fabric, to gather information that is inaccessible by other means.
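To make the sensing principle concrete, here is a minimal Python sketch of how a tactile feature vector might be derived from an array of three-axis magnetometer readings. The number of magnetometers, the `read_raw_magnetometers` stand-in, and the baseline-subtraction step are illustrative assumptions, not the published ReSkin interface.

```python
import numpy as np

NUM_MAGNETOMETERS = 5  # assumed sensor count; actual hardware may differ


def read_raw_magnetometers():
    """Stand-in for a driver call returning an (N, 3) array of magnetic
    flux readings (x, y, z) from the magnetometers under the skin."""
    return np.random.normal(0.0, 1.0, size=(NUM_MAGNETOMETERS, 3))


def tactile_features(raw, baseline):
    """Turn raw readings into a flat feature vector by subtracting a
    no-contact baseline, so the features reflect field changes caused by
    deformation of the skin rather than the ambient magnetic field."""
    return (raw - baseline).flatten()


if __name__ == "__main__":
    # Calibrate: average a few no-contact readings to estimate the baseline.
    baseline = np.mean([read_raw_magnetometers() for _ in range(20)], axis=0)

    # Each subsequent reading becomes a 3*N-dimensional tactile signal.
    sample = tactile_features(read_raw_magnetometers(), baseline)
    print(sample.shape)  # (15,) with the assumed 5 magnetometers
```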

This novel tactile sensor was installed on the finger of a mini-Delta gripper mounted on a seven-degree-of-freedom Franka robotic arm. The team wanted to teach this robot to perform tasks like folding laundry, manufacturing textiles, and assisting with dressing, so they started with an important first step: grabbing a specific number of pieces of fabric from a pile. Manually programming rules to accomplish this task would be difficult, and such rules could not adapt to situations that had not been planned for in advance, so the team instead used a machine learning classifier to recognize how many layers of cloth the robot is grasping.

After collecting a set of training data and training a k-means classification model, the system was ready to recognize four distinct scenarios: pinching with no cloth between its fingers, or pinching with one, two, or three layers of cloth between its fingers. This information was then fed into a pipeline that alters the robotic arm's grasp policy by adjusting the finger positioning. An average classification accuracy of 84% was observed across a series of trials, showing that ReSkin combined with a machine learning classifier is an effective tool. It was noted, however, that the system became less effective as the number of cloth layers increased.
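To illustrate how a classification step and grasp adjustment could fit together, here is a rough Python sketch of a centroid-based (k-means-style) classifier over tactile feature vectors, paired with a toy feedback rule for the finger gap. The class labels, helper names, and step size are assumptions made for illustration; this is not the authors' published pipeline.

```python
import numpy as np

CLASSES = [0, 1, 2, 3]  # number of cloth layers between the fingers


def fit_centroids(features, labels):
    """Compute one centroid per class from labeled tactile samples."""
    features, labels = np.asarray(features), np.asarray(labels)
    return {c: features[labels == c].mean(axis=0) for c in CLASSES}


def predict_layers(centroids, sample):
    """Assign a new tactile sample to the class with the nearest centroid."""
    dists = {c: np.linalg.norm(sample - mu) for c, mu in centroids.items()}
    return min(dists, key=dists.get)


def adjust_grasp(finger_gap_mm, predicted, target):
    """Toy feedback rule: open slightly if too many layers were grasped,
    close slightly if too few. The 0.5 mm step size is arbitrary."""
    step = 0.5
    if predicted > target:
        return finger_gap_mm + step
    if predicted < target:
        return finger_gap_mm - step
    return finger_gap_mm
```

In practice, the tactile samples would come from ReSkin readings like those in the earlier sketch, and the grasp adjustment would sit inside the arm's control loop rather than being a standalone function.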

ReSkin has shown that it may be a better path forward for tactile sensing, providing accurate results where other methods, especially optical sensing, are not well suited to the task. The scope of this initial work was fairly limited, but the team hopes that it will inspire future research leading to robots that can perform a wide variety of useful tasks through tactile sensing.

Nick Bild
R&D, creativity, and building the next big thing you never knew you wanted are my specialties.