This Robotic Arm Acts as a Helping Hand in the Lab

Ever wanted a third hand on your electronics bench? Well, Mateo found a solution: an intelligent robotic arm that responds to voice commands.

Cameron Coward
5 months ago · Robotics

We humans evolved to spend our days gnawing on sticks in caves. We certainly didn’t evolve to grow the third and fourth hands that always seem to be necessary when working at the bench in an electronics lab. You’re already holding a soldering iron in one hand and solder in the other, so how are you supposed to also hold a PCB and component-laden tweezers at the same time? It can’t be done, and the static “helping hands” tools on the market only go so far. That’s why Mateo took a page from Tony Stark’s playbook and built this robotic arm that acts as a helping hand in the lab.

Mateo didn’t design the robotic arm itself and instead purchased an inexpensive kit on eBay. Like all robotic arms in the sub-$100 price range, it was basically just a small toy. It did come with servo motors for each of the six joints, but there wasn’t any kind of intelligent control. The operator had to control the movement of each individual joint manually. That’s fun for a few minutes, but it isn’t very useful. Mateo’s work focused on giving the robotic arm some true intelligence and the ability to do useful tasks, like J.A.R.V.I.S. in the MCU.

To smartify the robotic arm, Mateo turned to the new D-Robotics RDK X5, which they’re calling an “AI development board.” Really, it is a single-board computer (SBC) — similar to a Raspberry Pi 5, but with a bit more of a focus on AI and robotics. Compared to the Raspberry Pi 5, it has an eight-core CPU instead of a four-core (though the clock speed is slower), dual MIPI inputs, and an integrated CAN FD interface. That last bit is particularly nice for modern robot actuators, but Mateo didn’t need it for this project.

In this case, Mateo simply connected the D-Robotics RDK X5 SBC to the robotic arm’s servo motors through a servo driver board. He also added an ultrasonic sensor to detect things in front of the gripper.
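Mateo’s write-up doesn’t include code for this layer, but the math behind it is standard: hobby servos expect a pulse width proportional to the target angle, and an ultrasonic sensor reports a round-trip echo time that converts to distance. Here is a minimal sketch of those two conversions, assuming typical 50Hz servos with a 500–2500µs pulse range and an HC-SR04-style sensor; the board-specific I2C and GPIO calls are omitted.

```python
# Hypothetical helpers for a servo driver board and ultrasonic sensor.
# Pulse range and sensor behavior are assumptions, not Mateo's code.

SPEED_OF_SOUND_CM_PER_US = 0.0343  # speed of sound at roughly 20 C

def angle_to_pulse_us(angle_deg, min_us=500, max_us=2500):
    """Map a joint angle (0-180 degrees) to a servo pulse width in microseconds."""
    angle_deg = max(0.0, min(180.0, angle_deg))
    return min_us + (max_us - min_us) * angle_deg / 180.0

def echo_to_distance_cm(echo_time_us):
    """Convert a round-trip ultrasonic echo time to one-way distance in cm."""
    return echo_time_us * SPEED_OF_SOUND_CM_PER_US / 2.0

# Centering a joint lands at the midpoint pulse width:
print(angle_to_pulse_us(90))                    # 1500.0
# An echo of ~583 us means an object about 10 cm from the gripper:
print(round(echo_to_distance_cm(583), 1))       # 10.0
```

In practice the pulse width would be handed to the servo driver board over I2C, and the echo time would come from timing the sensor’s echo pin.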

Everything else was software. Mateo primarily relied on YOLO for computer vision, plus an unspecified machine learning model for voice recognition. Together, those let the robotic arm pick up and put down objects at Mateo’s request.
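The glue between those two models is the interesting part: a recognized phrase names an object, and the vision output has to resolve that name to a position the arm can reach for. A rough sketch of that lookup, assuming YOLO-style detections as (label, confidence, bounding box) tuples — the function and data here are illustrative, not Mateo’s actual code:

```python
# Illustrative glue logic between voice commands and YOLO detections.

def find_target(detections, requested_label, min_conf=0.5):
    """Return the pixel center of the best matching detection, or None."""
    best = None
    for label, conf, (x1, y1, x2, y2) in detections:
        if label == requested_label and conf >= min_conf:
            if best is None or conf > best[0]:
                best = (conf, ((x1 + x2) / 2, (y1 + y2) / 2))
    return best[1] if best else None

# Mock detections standing in for a YOLO inference result:
detections = [
    ("screwdriver", 0.91, (120, 80, 200, 300)),
    ("pcb", 0.87, (300, 150, 420, 260)),
]
# A voice command like "pick up the PCB" would resolve to a target point:
print(find_target(detections, "pcb"))  # (360.0, 205.0)
```

From there the center point would be mapped through the camera calibration into joint angles for the arm.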

It isn’t clear how much capability the robot truly has, because Mateo only shows a couple of quick demonstrations and they don’t provide a lot of clarity. But it can at least close and open the gripper on command, which is better than the third hands that evolution failed to give us.
