Researchers from MIT have developed a sensor-packed glove that captures pressure signals as users handle different objects. Those signals form a dataset a neural network can learn from to identify the objects being held. The researchers say the technology could give robots and prosthetics a sense of touch.
The STAG (Scalable Tactile Glove) is coated with an electrically conductive polymer whose resistance changes with applied pressure. Conductive threads sewn into the material overlap in a way that turns each crossing point into a pressure sensor, and these sensors cover the glove. Around 550 of them record pressure signals as the user handles objects in various ways and send that data to a neural network, which learns to identify each object and classify it by its predicted weight and feel, without any visual references.
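To make the pipeline concrete, here is a minimal sketch of the idea: each reading from the glove's roughly 550 resistive sensors forms one "pressure frame," which is normalized and fed to a classifier. Everything below is illustrative, not the researchers' code: the sensor count, the resistance-to-pressure mapping, and the toy linear softmax model (the real system uses a convolutional network trained on recorded grasp data) are all assumptions.

```python
import numpy as np

N_SENSORS = 548   # approximate sensor count for illustration
N_CLASSES = 26    # the 26 everyday objects in the dataset

rng = np.random.default_rng(0)

def resistance_to_pressure(resistance, r0=1000.0):
    """Map raw resistance to a rough pressure value.

    Assumes a piezoresistive film whose resistance drops as pressure
    increases; r0 is a made-up baseline (unloaded) resistance.
    """
    return np.clip(r0 / resistance - 1.0, 0.0, None)

def classify(frame, weights, bias):
    """Softmax over a linear projection of the flattened pressure frame."""
    logits = frame @ weights + bias
    exp = np.exp(logits - logits.max())   # subtract max for stability
    return exp / exp.sum()

# Fake one frame of raw resistance readings and untrained weights.
raw = rng.uniform(200.0, 1000.0, size=N_SENSORS)
frame = resistance_to_pressure(raw)
weights = rng.normal(scale=0.01, size=(N_SENSORS, N_CLASSES))
bias = np.zeros(N_CLASSES)

probs = classify(frame, weights, bias)
print(probs.argmax())   # index of the predicted object class
```

In the actual system, many such frames are collected per grasp, so the network can learn from how pressure patterns evolve as the hand explores the object, not just from a single snapshot.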
“Humans can identify and handle objects well because we have tactile feedback. As we touch objects, we feel around and realize what they are. Robots don’t have that rich feedback. We’ve always wanted robots to do what humans can do, like doing the dishes or other chores. If you want robots to do these things, they must be able to manipulate objects really well.” — Lead researcher Subramanian Sundaram
The researchers compiled a dataset covering 26 everyday objects, including a spoon, scissors, a mug, a soda can, and a tennis ball, and the system identified those objects with 76% accuracy. The platform can also predict the weight of most objects to within 60 grams; the researchers may combine their system with sensors that detect torque, which would make those weight predictions more accurate.
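The weight-prediction idea can be sketched with a simple physical intuition: during a static hold, the total normal force sensed across the glove scales with the object's weight. The function and calibration constant below are invented for illustration; the actual system learns this mapping from data rather than using a fixed linear factor.

```python
import numpy as np

def estimate_weight_grams(frame, grams_per_unit=2.5):
    """Linear calibration from total sensed pressure to weight.

    `grams_per_unit` is a hypothetical calibration constant; a real
    system would fit this mapping (or a richer model) from labeled data.
    """
    return float(frame.sum() * grams_per_unit)

# A fake frame: 548 sensors, mostly idle, a contact patch under load.
frame = np.zeros(548)
frame[:40] = 1.2   # fingers/palm in contact with the object

print(round(estimate_weight_grams(frame), 1))   # -> 120.0
```

A purely pressure-based estimate like this misses how the object is balanced in the hand, which is why adding torque sensing, as the researchers suggest, would tighten the weight predictions.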