Kirigami-Inspired Cuts to a Common Material Give MIT's Soft Robot Trunk "Proprioception"

By taking a common but ill-suited conductive material and passing it through a laser cutter, researchers have developed a new, cheap sensor.

Researchers from the Massachusetts Institute of Technology (MIT) have developed a means of giving a soft robot arm the sense of proprioception — awareness of its motion and position in three-dimensional space — using "sensorized" skin.

"We’re sensorizing soft robots to get feedback for control from sensors, not vision systems, using a very easy, rapid method for fabrication," explains co-first author Ryan Truby of the project. "We want to use these soft robotic trunks, for instance, to orient and control themselves automatically, to pick things up and interact with the world. This is a first step toward that type of more sophisticated automated control."

The proof-of-concept device resembles a robotic elephant trunk, capable of swinging around as well as extending and retracting. Where a traditional robot might rely on camera systems, it instead uses sensors across its skin — built in a way that makes mass production as simple as possible, while the data is fed through a deep learning system capable of filtering out the noise and building a picture of the trunk's current configuration and position.
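In broad strokes, the control loop reads each patch of the skin, pushes the readings through the trained network, and takes the output as the trunk's estimated configuration. The sketch below illustrates that flow in Python; the sensor count, the read_skin() driver call, and the placeholder estimator are all assumptions for illustration, not the team's actual implementation.

```python
# A minimal sketch of the sensing-to-estimate loop, under assumed names.
import random
from typing import List

N_SENSORS = 12  # assumed number of sensor patches on the trunk's skin

def read_skin() -> List[float]:
    """Hypothetical driver call: one noisy reading per skin patch."""
    return [random.gauss(0.5, 0.1) for _ in range(N_SENSORS)]

def estimate_pose(readings: List[float]) -> List[float]:
    """Stand-in for the trained deep-learning model: maps skin readings
    to an estimate of the trunk's configuration (here, a dummy average)."""
    mean = sum(readings) / len(readings)
    return [mean] * 3  # placeholder x, y, z estimate

for _ in range(3):  # a few control-loop iterations for the demo
    pose = estimate_pose(read_skin())
    print(f"estimated trunk configuration: {pose}")
```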

"Think of your own body: You can close your eyes and reconstruct the world based on feedback from your skin," explains co-author Daniela Rus. "We want to design those same capabilities for soft robots."

The secret lies not in the material chosen for the skin, a simple off-the-shelf roll of common conductor, but in how it is shaped. "I found these sheets of conductive materials used for electromagnetic interference shielding, that you can buy anywhere in rolls," Truby recalls. The sheets were then passed through a laser cutter to create a pattern inspired by kirigami, a variant of origami, the art of paper folding, in which cuts are made as well as folds; the cuts allow the otherwise unstretchable material to stretch.
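Stretching a kirigami-cut conductive sheet changes its electrical resistance, which is what makes the skin readable at all. As a rough illustration, a single patch could be sampled with nothing more than a fixed resistor, a voltage divider, and an ADC; the sketch below shows the arithmetic, with every component value and name an assumption rather than a detail from the paper.

```python
# Hypothetical sketch: convert raw ADC counts from a voltage divider into
# the sensor patch's resistance, then into a fractional stretch signal.
V_SUPPLY = 3.3        # assumed supply voltage across the divider (volts)
R_FIXED = 10_000.0    # assumed fixed resistor in series with the sensor (ohms)
ADC_MAX = 4095        # full-scale count for a 12-bit ADC

def counts_to_resistance(counts: int) -> float:
    """Convert an ADC reading taken across the sensor to its resistance."""
    v_sensor = V_SUPPLY * counts / ADC_MAX
    # Voltage divider: v_sensor = V_SUPPLY * R_sensor / (R_FIXED + R_sensor)
    return R_FIXED * v_sensor / (V_SUPPLY - v_sensor)

def normalized_signal(counts: int, r_rest: float) -> float:
    """Express the reading as fractional change from the resting resistance."""
    return (counts_to_resistance(counts) - r_rest) / r_rest

if __name__ == "__main__":
    r_rest = counts_to_resistance(1800)   # resistance with the trunk at rest
    for counts in (1800, 2100, 2500):     # synthetic readings as the skin stretches
        print(f"ADC {counts}: dR/R = {normalized_signal(counts, r_rest):+.3f}")
```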

While the material is described by Truby as a "non-ideal sensor," the deep learning system behind it, trained on data from a traditional motion capture system, is able to filter out the noise and estimate the trunk's configuration with reasonable accuracy "for certain and steadier configurations." The next stage: reducing the amount of training required, extending the system to capture the trunk's full range of motion, and improving accuracy for more complex movements.
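As a loose illustration of that training setup, the sketch below fits a small feed-forward network to map noisy sensor channels onto pose targets, with synthetic data standing in for both the skin readings and the motion-capture ground truth. The architecture, channel counts, and hyperparameters are all assumptions; the team's actual model is not described here.

```python
# A minimal sketch, assuming a small feed-forward regression network.
import torch
from torch import nn

N_SENSORS, N_POSE = 12, 6   # assumed: 12 skin channels, 6 pose values

model = nn.Sequential(
    nn.Linear(N_SENSORS, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, N_POSE),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Synthetic stand-in data: sensor readings paired with "mocap" poses.
true_map = torch.randn(N_SENSORS, N_POSE)
sensors = torch.randn(4096, N_SENSORS)
poses = sensors @ true_map + 0.1 * torch.randn(4096, N_POSE)
sensors += 0.3 * torch.randn_like(sensors)   # the "non-ideal sensor" noise

for epoch in range(200):
    optimizer.zero_grad()
    loss = loss_fn(model(sensors), poses)
    loss.backward()
    optimizer.step()

print(f"final training loss: {loss.item():.4f}")
```

The motion capture system plays the role your eyes would play in the researchers' analogy: it supplies ground truth while the network learns to reconstruct pose from skin feedback alone, after which the cameras can be switched off.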

More information on the team's work, which is to be published in the journal IEEE Robotics and Automation Letters, can be found on the MIT website.

Gareth Halfacree
Freelance journalist, technical author, hacker, tinkerer, erstwhile sysadmin. For hire: freelance@halfacree.co.uk.