Bi-Touch Trains a Dual-Arm Robot in the Art of Gentle Manipulation "Within a Couple of Hours"

Using an AI agent working with only the senses of touch and proprioception, these robot arms can handle even the most fragile of items.

Researchers from the University of Bristol's Department of Engineering Mathematics and Bristol Robotics Laboratory have developed an approach for giving dual-arm robots tactile sensitivity "close to human-level" β€” by using a simulation-trained artificial intelligence (AI) agent to interpret its surroundings and control its movement.

"With our Bi-Touch system, we can easily train AI agents in a virtual world within a couple of hours to achieve bimanual tasks that are tailored towards the touch," claims lead author Yijiong Lin of the team's work. "And more importantly, we can directly apply these agents from the virtual world to the real world without further training. The tactile bimanual agent can solve tasks even under unexpected perturbations and manipulate delicate objects in a gentle way."

To create Bi-Touch, the researchers put together a simulated environment populated with a pair of robot arms fitted with tactile sensing capabilities. A series of reward functions and a goal-update mechanism give the virtual robots a way to learn how to carry out various tasks through deep reinforcement learning β€” and once suitably educated, the agent can switch to controlling real robot arms.
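The broad shape of that training setup β€” a shaped reward that favors gentle contact, plus a goal that advances as the agent improves β€” can be illustrated with a toy sketch. This is not the team's code; the function names, the force-matching reward, and the curriculum-style goal update are all hypothetical stand-ins for the general technique:

```python
import random

# Hypothetical sketch of the idea described above: an agent trained in
# simulation on tactile observations, with a shaped reward and a
# goal-update mechanism. None of this is the authors' actual code.

def tactile_reward(contact_force, target_force):
    """Reward gentle handling: highest when the sensed contact force
    matches the target, penalizing both crushing and losing contact."""
    return -abs(contact_force - target_force)

def update_goal(goal, progress, step=0.1):
    """Goal-update mechanism: once the agent reaches the current
    sub-goal, nudge the target toward the full task (curriculum-style)."""
    if progress >= goal:
        return min(goal + step, 1.0)
    return goal

# Toy training loop standing in for deep reinforcement learning.
random.seed(0)
goal, progress = 0.1, 0.0
for episode in range(50):
    contact = random.uniform(0.0, 2.0)    # stand-in for simulated touch sensing
    reward = tactile_reward(contact, target_force=1.0)
    progress = min(progress + 0.05, 1.0)  # stand-in for policy improvement
    goal = update_goal(goal, progress)

print(round(goal, 1))  # goal has advanced to the full task
```

In the real system the reward terms and goal updates are task-specific and the policy is a neural network trained in simulation, but the loop structure β€” observe touch, score gentleness, raise the bar β€” is the core of the approach.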

"Our Bi-Touch system showcases a promising approach with affordable software and hardware for learning bimanual behaviors with touch in simulation, which can be directly applied to the real world," says co-author Nathan Lepora, professor at the University of Bristol, of the work. "Our developed tactile dual-arm robot simulation allows further research on more different tasks as the code will be open source, which is ideal for developing other downstream tasks."

The team was able to train the agent to use the arms collaboratively to pick up and manipulate objects β€” and without visual feedback. Instead, the agent relies exclusively on touch and an electronic sense of proprioception, the innate sense of where the robot arms are in space. In real-world testing, the approach proved its potential β€” even picking up an object as fragile as a single potato chip.

The team's work has been published under closed-access terms in the journal IEEE Robotics and Automation Letters; a preprint is available under open-access terms on Cornell's arXiv server.

Gareth Halfacree
Freelance journalist, technical author, hacker, tinkerer, erstwhile sysadmin. For hire: freelance@halfacree.co.uk.