This Robotic Arm Uses Machine Learning to Move Itself Intelligently Based on Motion Capture

By training a model using a motion capture setup, James Bruton was able to create an automatic prosthetic arm that thinks on its own.

The idea

Attempting to control robotics with just a few sensors and encoders can be extremely difficult, as working out where the robot should end up from such limited data normally requires a lot of advanced math. That is why maker and YouTuber James Bruton had the idea of combining traditional robotics techniques with embedded machine learning, letting a robotic arm learn from a small amount of training data and then move on its own.

Capturing motion data

Before Bruton could start training a model, he first needed to collect some body movement data. To do this, he created four identical sensor modules, each containing a single Teensy LC and an MPU-6050 six-axis inertial measurement unit (IMU) that measures the acceleration and rotation of the limb it is attached to. Just below the MPU-6050 is an AMS AS5047 magnetic rotary encoder that records the current position of the rotational joint, such as an elbow or knee. Each Teensy LC sends all of this data over a serial connection to a much more powerful Teensy 4.1. There is also a Teensy LC + MPU-6050 module mounted to the top of a headband for capturing where the user's head is positioned.
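To give a rough idea of what one of those sensor modules does, here is a minimal Teensy-style sketch that reads raw accelerometer and gyroscope values from the MPU-6050 over I2C, reads the 14-bit angle from the AS5047 over SPI, and streams everything as a CSV line over serial. The register addresses are the standard ones for these parts, but the pin assignments, output format, and sample rate are assumptions for illustration rather than Bruton's actual firmware.

```cpp
// Hypothetical sensor-module sketch (Teensy LC): MPU-6050 over I2C + AS5047 over SPI.
// Pin choices, CSV format, and sample rate are illustrative, not Bruton's firmware.
#include <Wire.h>
#include <SPI.h>

const uint8_t MPU_ADDR = 0x68;       // MPU-6050 I2C address (AD0 pulled low)
const uint8_t ENC_CS   = 10;         // chip-select pin for the AS5047 (assumed)

void setup() {
  Serial1.begin(115200);             // hardware UART link to the Teensy 4.1 (assumed wiring)
  Wire.begin();
  SPI.begin();
  pinMode(ENC_CS, OUTPUT);
  digitalWrite(ENC_CS, HIGH);

  // Wake the MPU-6050 by clearing the sleep bit in PWR_MGMT_1 (register 0x6B).
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x6B);
  Wire.write(0x00);
  Wire.endTransmission();
}

// Read one signed 16-bit value (high byte first) from an MPU-6050 register pair.
int16_t readMPU16(uint8_t reg) {
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(reg);
  Wire.endTransmission(false);
  Wire.requestFrom(MPU_ADDR, (uint8_t)2);
  return (Wire.read() << 8) | Wire.read();
}

// Read the 14-bit angle from the AS5047 (ANGLECOM register 0x3FFF).
// The response belongs to the previous SPI frame, which is fine when the same
// register is polled continuously.
uint16_t readEncoder() {
  SPI.beginTransaction(SPISettings(1000000, MSBFIRST, SPI_MODE1));
  digitalWrite(ENC_CS, LOW);
  uint16_t raw = SPI.transfer16(0xFFFF);   // read command for 0x3FFF with parity bit set
  digitalWrite(ENC_CS, HIGH);
  SPI.endTransaction();
  return raw & 0x3FFF;                     // lower 14 bits hold the angle
}

void loop() {
  int16_t ax = readMPU16(0x3B);            // accelerometer X/Y/Z registers start at 0x3B
  int16_t ay = readMPU16(0x3D);
  int16_t az = readMPU16(0x3F);
  int16_t gx = readMPU16(0x43);            // gyroscope X/Y/Z registers start at 0x43
  int16_t gy = readMPU16(0x45);
  int16_t gz = readMPU16(0x47);
  uint16_t joint = readEncoder();

  // One CSV line per sample for the Teensy 4.1 to aggregate (Teensy's Print supports printf).
  Serial1.printf("%d,%d,%d,%d,%d,%d,%u\n", ax, ay, az, gx, gy, gz, joint);
  delay(20);                               // roughly 50 Hz per module (assumed)
}
```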

Although the data coming into the Teensy 4.1 is sufficient for some basic motion capture, it is nowhere near enough to be considered "accurate," as the Teensy LC/MPU-6050 modules only capture two axes of motion each. However, it should work well enough for this project.

Training a machine learning model

With the motion capture figured out, it was time to train a machine learning model and deploy it onto a device. For this application, Bruton used the OgmaNeo AI software, which is ordinarily used for robotic reinforcement learning and was well suited to the task. The data from each sensor unit was fed from the Teensy 4.1 to a Raspberry Pi Zero at a rate of 10 Hz, as anything faster bogged down the Pi's CPU too much.
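The exact link between the Teensy 4.1 and the Pi Zero isn't documented in the article, but the basic pattern of caching the latest reading from each module and pushing a combined frame out at a fixed 10 Hz might look something like the sketch below. The serial port assignments and frame format are assumptions.

```cpp
// Hypothetical aggregator sketch (Teensy 4.1): keep the newest CSV line from each
// sensor module and forward one combined frame to the Raspberry Pi Zero at 10 Hz.
// Port assignments and frame layout are illustrative assumptions.

const uint8_t NUM_MODULES = 4;
HardwareSerial* modules[NUM_MODULES] = { &Serial1, &Serial2, &Serial3, &Serial4 };
String latest[NUM_MODULES];      // most recent complete line per module
String pending[NUM_MODULES];     // line currently being received

elapsedMillis sinceSend;         // Teensy helper that counts elapsed milliseconds

void setup() {
  Serial5.begin(115200);         // UART link to the Pi Zero (assumed)
  for (uint8_t i = 0; i < NUM_MODULES; i++) modules[i]->begin(115200);
}

void loop() {
  // Drain every module port, keeping only the newest complete line from each.
  for (uint8_t i = 0; i < NUM_MODULES; i++) {
    while (modules[i]->available()) {
      char c = modules[i]->read();
      if (c == '\n') { latest[i] = pending[i]; pending[i] = ""; }
      else           { pending[i] += c; }
    }
  }

  // Every 100 ms (10 Hz), send one combined frame to the Pi.
  if (sinceSend >= 100) {
    sinceSend = 0;
    for (uint8_t i = 0; i < NUM_MODULES; i++) {
      Serial5.print(latest[i]);
      Serial5.print(i < NUM_MODULES - 1 ? ',' : '\n');
    }
  }
}
```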

Once the model had been trained with a series of repetitive motions, the serial plotter showed that the emulated right arm was able to correctly predict what it should be doing based on the other inputs.

Building the robot arm

To take the virtual output of the machine learning model and transform it into physical motion, Bruton had to design and build his own robotic arm. It consists of three high-torque servo motors that give the arm three degrees of freedom, which is plenty for this application. A single Arduino MKR board controls the servos over a serial connection with the help of a DYNAMIXEL Shield.
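Bruton's control firmware isn't shown in the article, but with the Dynamixel2Arduino library a minimal position-control loop on the MKR board could look roughly like this. The servo IDs, baud rate, direction pin, and the idea of receiving joint targets as a CSV line over USB serial are all assumptions that depend on how the shield and servos are configured.

```cpp
// Hypothetical Arduino MKR + DYNAMIXEL Shield sketch: drive three DYNAMIXEL servos
// to target angles received as a CSV line over the USB serial port.
// IDs, baud rate, and the direction pin are assumptions; check the shield's docs.
#include <Dynamixel2Arduino.h>

#define DXL_SERIAL    Serial1    // hardware UART wired to the DYNAMIXEL Shield
const int DXL_DIR_PIN = A6;      // half-duplex direction pin (shield dependent)

Dynamixel2Arduino dxl(DXL_SERIAL, DXL_DIR_PIN);
const uint8_t SERVO_IDS[3] = {1, 2, 3};

void setup() {
  Serial.begin(115200);          // incoming joint targets (upstream link assumed)
  dxl.begin(57600);              // must match the servos' configured baud rate
  dxl.setPortProtocolVersion(2.0);

  for (uint8_t i = 0; i < 3; i++) {
    dxl.torqueOff(SERVO_IDS[i]);
    dxl.setOperatingMode(SERVO_IDS[i], OP_POSITION);
    dxl.torqueOn(SERVO_IDS[i]);
  }
}

void loop() {
  // Expect lines like "90.0,45.5,120.0\n" with one angle per joint.
  if (Serial.available()) {
    String line = Serial.readStringUntil('\n');
    float angles[3];
    int start = 0;
    for (uint8_t i = 0; i < 3; i++) {
      int comma = line.indexOf(',', start);
      angles[i] = line.substring(start, comma < 0 ? line.length() : comma).toFloat();
      start = comma + 1;
    }
    for (uint8_t i = 0; i < 3; i++) {
      dxl.setGoalPosition(SERVO_IDS[i], angles[i], UNIT_DEGREE);
    }
  }
}
```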

How does the system perform?

After attaching the arm to a backpack mount, Bruton began to train the system and test it further. He added some additional filtering to the signals coming from the Teensy 4.1 to smooth out the rather rough 10 Hz waveforms. The model was trained to raise the arm when Bruton's left leg is up and lower it when his right leg/left arm is up. As can be seen in his build log/demonstration video, it works, and quite well at that. By reducing some of the filtering, Bruton was able to get the arm to move a bit faster.
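The article doesn't say which filter Bruton used, but one simple way to smooth a choppy 10 Hz command stream is an exponential moving average on each joint target, sketched below with an illustrative smoothing factor. Lowering the amount of filtering (raising alpha) trades smoothness for responsiveness, which matches the speed-up Bruton saw when he dialed the filtering back.

```cpp
// Illustrative exponential moving average for smoothing 10 Hz joint targets.
// This filter is a stand-in; the article doesn't name Bruton's exact method.
struct SmoothedJoint {
  float alpha;          // 0..1: higher = less filtering, faster response
  float value;          // current filtered output
  bool  primed = false; // becomes true after the first sample

  explicit SmoothedJoint(float a) : alpha(a), value(0.0f) {}

  float update(float target) {
    if (!primed) { value = target; primed = true; }   // avoid a startup ramp
    else         { value += alpha * (target - value); }
    return value;
  }
};

// Example: three joints with moderately heavy smoothing.
SmoothedJoint joints[3] = { SmoothedJoint(0.3f), SmoothedJoint(0.3f), SmoothedJoint(0.3f) };

// Called whenever a new 10 Hz frame of predicted angles arrives.
void onNewTargets(const float raw[3], float smoothed[3]) {
  for (int i = 0; i < 3; i++) smoothed[i] = joints[i].update(raw[i]);
}
```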

Future plans

Even though the arm already performs well at simple tasks, Bruton wants to improve it even further. He discovered that the IMU on his arm gives inaccurate accelerometer readings while it is being rotated, so adding extra axes of motion sensing could help the robot when it tries to predict what to do across a range of different actions. Additionally, Bruton plans on adding EMG data, which could let the robot "read" the user's muscle signals and derive movements from them.

You can find the code and design files in this GitHub repository.

Evan Rust
Embedded Software Engineer II @ Amazon's Project Kuiper. Contact me for product reviews or custom project requests.