Human-robot interaction systems — systems in which humans and robots occupy the same physical space, and can manipulate the same objects — can lighten the workload for numerous tasks. For many uses, however, these systems are out of reach due to high cost and system complexity. These constraints can be especially limiting for the hobbyist or prototyper.
A new low-cost, open source human-robot interaction system has been developed by the Ubiquitous Computing Lab at the University of Siegen to address this gap in access.
The system was built with widely available hardware: a 3D-printable 3-DOF RRR robot arm, three MG996R servo motors to drive the arm's joints, a Raspberry Pi 4 Model B (4GB), a stereo camera, and a Texas Instruments MSP430FR5969 LaunchPad microcontroller. The entire bill of materials comes in under $250.
The Raspberry Pi processes the images captured by the stereo camera. A Python 3 module running on the Pi detects and tracks ArUco 2D barcodes affixed to objects of interest. Once detected, objects are mapped into the robot's coordinate system using the Robot Operating System (ROS). This functionality serves as the basis for all robot-object interactions.
A stereo camera alone could easily break the budget for this lean build, so the engineers built their own from a pair of Logitech C525 webcams. The pair was calibrated and rectified using OpenCV, the popular computer vision library.
To complete the link between processing and action in the physical world, the microcontroller serves as a UART-to-PWM bridge, translating serial commands from the Pi into the PWM signals that drive the servos of the robotic arm.
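The Pi-side half of that bridge might look like the following. The wire protocol between the Pi and the MSP430 isn't published, so the ASCII command format here (`S<channel>:<pulse_us>`) is purely hypothetical, as is the 500-2500 µs pulse range, which is merely typical for MG996R-class servos:

```python
# Hypothetical Pi-side sender for a UART-to-PWM servo bridge.

def angle_to_pulse_us(angle_deg, min_us=500, max_us=2500):
    """Map a 0-180 degree servo angle to a pulse width in microseconds.
    The 500-2500 us range is typical for MG996R servos (an assumption)."""
    angle_deg = max(0.0, min(180.0, angle_deg))
    return int(round(min_us + (max_us - min_us) * angle_deg / 180.0))

def servo_command(channel, angle_deg):
    """Build one ASCII command line for a servo channel (made-up format)."""
    return f"S{channel}:{angle_to_pulse_us(angle_deg)}\n".encode("ascii")

def move_arm(port, joint_angles):
    """Send one command per joint over UART, e.g. port='/dev/ttyACM0'."""
    import serial  # pyserial; imported here so the helpers above stay testable
    with serial.Serial(port, baudrate=115200, timeout=1) as link:
        for channel, angle in enumerate(joint_angles):
            link.write(servo_command(channel, angle))
```

On the MSP430 side, the firmware would parse each line and load the pulse width into a timer compare register; splitting the work this way keeps the servo timing deterministic while the Pi stays free to run vision.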
It is always exciting to see expensive tools redesigned and made accessible to the hobbyist. With some luck, other hackers will run with this idea and extend it in the future. Might I suggest adding some type of grabber as a first step? Go!