MIT Researchers Develop New Grippers with a Humanlike Touch

The MIT CSAIL team came up with new ways to help robots perceive the objects they're interacting with.

Cabe Atwell

There are plenty of benefits that come with soft robotics, but the technology also has a number of limitations. One of them is a lack of tactile sense: most soft robotic grippers can't sense the position of their own fingers. Researchers at MIT's Computer Science and Artificial Intelligence Lab (CSAIL) have proposed two new grippers to address the issue.

The first is an improved version of a gripper design that CSAIL and Harvard demonstrated last year. It has a hollow, cone-shaped body made of three parts that surrounds items rather than clutching them, which lets it lift a wide range of household objects. The team improved the design by adding tactile sensors made from latex balloons connected to pressure transducers. The sensors let the gripper pick up delicate objects, like potato chips, while also identifying them, giving it a better understanding of what it's grasping.

The sensors, attached with silicone adhesive, register internal pressure changes when force or strain is applied: one sits on the gripper's outer circumference to measure its changing diameter, while the other four are attached to the inside to measure contact forces. Those pressure readings were then used to train an object-detecting algorithm running on an Arduino Due. Across numerous experiments, the researchers found the gripper was able to pick up kettle chips without damage 80% of the time.
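For a sense of how such a pipeline might fit together, here is a minimal illustrative sketch, not the CSAIL team's actual code: it treats the five pressure readings as a feature vector and classifies the grasped object with a simple nearest-centroid rule. The sensor values, object labels, and classifier choice are all assumptions for illustration; the article only states that an object-detecting algorithm trained on the pressure data runs on an Arduino Due.

```python
# Illustrative sketch (not the CSAIL code): classifying a grasped object from
# the five pressure-transducer readings described above. The feature vector is
# [outer-diameter sensor, four contact-force sensors]; a nearest-centroid rule
# stands in for the unspecified algorithm deployed on the Arduino Due.
import numpy as np

# Hypothetical training data: rows are pressure readings (kPa) captured while
# grasping known objects; labels name the object being held.
X_train = np.array([
    [12.1, 3.2, 3.0, 2.9, 3.1],   # kettle chip (light, wide contact)
    [11.8, 3.4, 3.1, 3.0, 3.3],
    [18.5, 7.9, 8.2, 7.7, 8.0],   # soda can (heavier, firmer contact)
    [18.9, 8.1, 8.0, 7.9, 8.3],
])
y_train = np.array(["chip", "chip", "can", "can"])

def fit_centroids(X, y):
    """Average the readings for each object class."""
    return {label: X[y == label].mean(axis=0) for label in np.unique(y)}

def classify(centroids, reading):
    """Return the label whose centroid is closest to the new reading."""
    return min(centroids, key=lambda lbl: np.linalg.norm(reading - centroids[lbl]))

centroids = fit_centroids(X_train, y_train)
print(classify(centroids, np.array([12.0, 3.3, 3.1, 3.0, 3.2])))  # -> "chip"
```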

The team also proposed GelFlex, a gripper made of a soft, transparent silicone finger with one camera near the fingertip, a second in the middle, reflective ink on the front and sides, and LED lights on the back. The cameras, fitted with fisheye lenses, track the finger's deformations, and that data is used to train AI models to estimate information like bending angles and the shape and size of the objects it grabs. GelFlex's design, which resembles those As-Seen-On-TV grippers, can pick up various objects, like a DVD case, a block of aluminum, and a Rubik's cube.
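As a rough illustration of the vision-based approach, the sketch below shows a small PyTorch network that maps a fingertip-camera frame to a bending-angle and object-size estimate. The architecture, input resolution, and output targets are assumptions for illustration; the article only says that images from the fisheye cameras are used to train AI models that capture bending angles and object shape and size.

```python
# Illustrative sketch (assumptions, not the GelFlex code): a small regression
# network mapping a fingertip-camera image to the finger's bending angle and
# the grasped object's size, the kind of mapping the article describes.
import torch
import torch.nn as nn

class ProprioceptionNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        # Two regression targets: bending angle (degrees) and object width (mm).
        self.head = nn.Linear(32, 2)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

# Hypothetical usage: a batch of 64x64 RGB frames from the fingertip camera.
model = ProprioceptionNet()
frames = torch.randn(8, 3, 64, 64)
angle_and_width = model(frames)   # shape: (8, 2)
print(angle_and_width.shape)
```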

Next, the team will work on improving the grippers' sense of movement and tactile sensing algorithms. They also want to use vision-based sensors to estimate more complex finger configurations, like twisting or lateral bending. The research will be presented virtually at the 2020 International Conference on Robotics and Automation.
