AnyTeleop Lets You Control Robot Arms and Hands From Anywhere with Just a Camera

The system can track the operator's hand movements with any camera and without any calibration, researchers claim.

Gareth Halfacree
10 months ago • Machine Learning & AI / Robotics

A team of researchers from the University of California San Diego and NVIDIA has come up with a vision-based system that offers high-dexterity teleoperation of robot arms and hands: AnyTeleop.

"Vision-based teleoperation offers the possibility to endow robots with human-level intelligence to physically interact with the environment, while only requiring low-cost camera sensors," the researchers explain of the background to their work. "However, current vision-based teleoperation systems are designed and engineered towards a particular robot model and deploy environment, which scales poorly as the pool of the robot models expanded and the variety of the operating environment increases."

The solution, they argue, is AnyTeleop: a general vision-based system for dexterous teleoperation of virtually any robot arm and hand combination, paired with virtually any camera setup. Better still, the team claims it can outperform rival systems optimized for a single arm, hand, and camera combination, and it has demonstrated as much with both real-world robots and in simulation.

"Compared to the costly wearable hand tracking solutions, such as gloves, marker-based motion capture systems, inertia sensors, or VR headsets, vision-based hand tracking is particularly favorable due to its low cost and low intrusion to the human operator," the team explains. "AnyTeleop is designed for arbitrary dexterous arm-hand systems that are not limited to any specific robot type. [It] is decoupled from specific hardware drivers or physics simulators. [It] can consume data from both RGB [visible light] and RGB-D [depth-sensing] cameras, and from either single or multiple cameras.

In addition to its hardware flexibility, AnyTeleop is claimed to work out of the box, requiring no direct contact with the operator, no calibration, and no reliance on depth data. It supports control of multiple robot arms and hands, as well as collaborative control by multiple human operators, and has been demonstrated both in simulation and on real hardware.
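To give a sense of the kind of input such a system works from, the sketch below shows uncalibrated, contact-free hand tracking from a plain RGB webcam. It is an illustration only, not AnyTeleop's own pipeline: it assumes the off-the-shelf MediaPipe Hands and OpenCV libraries are installed, and it stops at extracting 21 hand keypoints per frame; retargeting those keypoints to a specific robot arm and hand is the part AnyTeleop aims to generalize.

```python
# Minimal, illustrative hand-keypoint capture from an uncalibrated RGB webcam.
# This is NOT AnyTeleop's code: MediaPipe Hands and OpenCV stand in for the
# vision front end, and no retargeting to a robot hand is performed.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands


def stream_hand_keypoints(camera_index: int = 0):
    """Yield 21 normalized (x, y, z) hand landmarks for each webcam frame."""
    cap = cv2.VideoCapture(camera_index)
    with mp_hands.Hands(max_num_hands=1,
                        min_detection_confidence=0.5,
                        min_tracking_confidence=0.5) as hands:
        while cap.isOpened():
            ok, frame_bgr = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB input; OpenCV captures frames as BGR.
            frame_rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
            result = hands.process(frame_rgb)
            if result.multi_hand_landmarks:
                landmarks = result.multi_hand_landmarks[0].landmark
                yield [(lm.x, lm.y, lm.z) for lm in landmarks]
    cap.release()


if __name__ == "__main__":
    for keypoints in stream_hand_keypoints():
        # A full teleoperation stack would map these keypoints to robot joint
        # targets here; this sketch simply prints the wrist landmark.
        print("wrist:", keypoints[0])
```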

"The experiments show that AnyTeleop outperforms previous systems in both simulation and real-world scenarios," the researchers conclude, "while offering superior generalizability and flexibility. Our commitment to an open-source approach will facility further research into the field of teleoperation.

A PDF copy of the team's paper on the work is available on the NVIDIA Research portal, under open-access terms.
