Welcome, fellow makers and robotics enthusiasts!
This short tutorial will guide you through running a LeRobot training session using the SO-ARM101 arm kit on the NVIDIA Jetson AGX Orin Developer Kit. If you've been following the LeRobot initiative, you'll know its goal is to make advanced robotics accessible, and the SO-ARM101 serves as an excellent entry point. Hugging Face recently launched the SO-101, an upgraded, low-cost, 3D-printable robot arm that succeeds the SO-100.
Seeed Studio provides comprehensive documentation, including all necessary code and commands, for using the SO-ARM100 and SO-ARM101 robotic arms with LeRobot and for conducting training. You can find this documentation at: https://wiki.seeedstudio.com/lerobot_so100m/
My focus here will be on aspects not explicitly covered in their documentation. You can also explore the LeRobot page on the NVIDIA Jetson AI Lab: https://www.jetson-ai-lab.com/lerobot.html
Step 1: Unboxing
The first step in any exciting build is, of course, the unboxing!
Here's what's inside the box:
Take a moment to familiarize yourself with each component. This initial inspection helps ensure you have everything needed for a smooth assembly process.
Step 2: Upgrade the Motor Firmware
It's very important to update the firmware of the servos to the latest version before assembly.
Make sure you follow the specific firmware update instructions provided by Seeed Studio for these motors.
- Connect the motors in a daisy chain to your Windows PC and upgrade them all, so every servo runs the same firmware version.
- Confirm that all motors are connected properly and responding to commands before moving on; a quick port check on the Jetson is sketched below.
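Once every servo reports the same firmware version, it's worth confirming that the Jetson can actually see the motor bus before you begin assembly. A minimal check, assuming LeRobot is already installed on the Jetson; the helper script's path can differ between LeRobot versions:

    ls /dev/ttyACM* /dev/ttyUSB*                      # the bus board usually enumerates as a USB serial device
    python lerobot/scripts/find_motors_bus_port.py    # LeRobot helper: unplug/replug the cable when prompted to identify the port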
Step 3: Assemble the Arm
Now for the hands-on part! Follow the assembly instructions provided with your kit carefully. Seeed Studio's documentation is very detailed, with clear diagrams and step-by-step guidance.
Once assembled, the next exciting phase is manual teleoperation.
This crucial step allows you to test the arm's movement and controls directly, ensuring everything is functioning as expected before you dive into data collection.
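With LeRobot installed, teleoperation is launched from the command line. The entry point and flag names vary between LeRobot releases, so treat this as a sketch to cross-check against the Seeed wiki; the serial ports and arm ids below are placeholders:

    python -m lerobot.teleoperate \
        --robot.type=so101_follower \
        --robot.port=/dev/ttyACM0 \
        --robot.id=my_follower_arm \
        --teleop.type=so101_leader \
        --teleop.port=/dev/ttyACM1 \
        --teleop.id=my_leader_arm

If the follower mirrors the leader's movements smoothly, your calibration and wiring are in good shape.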
Step 4: Prepare a Two-Camera Setup
I used a two-camera setup to get a comprehensive view of the arm's workspace. One camera was positioned to capture a top view of the entire operational area, while the second was set up closer, providing a detailed view of the end-effector's interactions.
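Before wiring the cameras into the recording setup, it helps to confirm which /dev/video* node belongs to which physical camera. A quick check with v4l-utils (sudo apt install v4l-utils if it's missing):

    v4l2-ctl --list-devices                       # map each USB camera to its /dev/video* node
    v4l2-ctl -d /dev/video0 --list-formats-ext    # supported resolutions and frame rates for one node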
Step 5: Record a Dataset
With our two-camera setup in place and the arm ready for action, it was time to record the dataset. I collected 30 episodes for training, focused on a specific task: using a "leader" arm (in my case, an SO-ARM101 under my direct control) to drive the "follower" arm (the one we're training the AI to control) to put a die inside a box.
I concentrated on clear, precise movements, guiding the follower arm through the pick-and-place task and capturing every nuance of the interaction between the die and the box.
This process, known as "imitation learning," is a powerful way to teach complex robotic behaviors without explicit programming. Training imitation-learning algorithms, like diffusion policy, has become much easier with recent open-source libraries like Hugging Face's LeRobot. The more diverse and accurate your demonstrations, the better your model will perform.
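Recording is driven from the command line as well. Flag names shift between LeRobot releases, so the following is only a sketch with placeholder values; I've omitted the camera configuration, which you should copy from the Seeed wiki for your exact setup:

    python -m lerobot.record \
        --robot.type=so101_follower \
        --robot.port=/dev/ttyACM0 \
        --teleop.type=so101_leader \
        --teleop.port=/dev/ttyACM1 \
        --dataset.repo_id=${HF_USER}/so101_dice_in_box \
        --dataset.num_episodes=30 \
        --dataset.single_task="Put the die inside the box"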
Step 6: Training
With our dataset collected, it's time for the magic of machine learning! The NVIDIA Jetson AGX Orin's powerful GPU is well suited to the training workload. I strongly recommend tracking the run with Weights & Biases, so make sure to run wandb login beforehand.
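The training entry point is lerobot/scripts/train.py (logged as ts/train.py in the output below). A hedged example invocation, assuming the dataset was pushed to the Hub under a placeholder repo id and an ACT policy is used; your policy choice and flags may differ by LeRobot version:

    python lerobot/scripts/train.py \
        --dataset.repo_id=${HF_USER}/so101_dice_in_box \
        --policy.type=act \
        --output_dir=outputs/train/so101_dice \
        --job_name=so101_dice \
        --wandb.enable=true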
The training itself took 6 hours, 5 minutes, and 46 seconds.
Here's a glimpse of the training metrics output:
INFO 2025-07-05 17:18:12 ts/train.py:232 step:48K smpl:386K ep:2K epch:53.88 loss:0.055 grdn:7.632 lr:1.0e-05 updt_s:0.447 data_s:0.001
INFO 2025-07-05 17:19:43 ts/train.py:232 step:48K smpl:387K ep:2K epch:54.10 loss:0.056 grdn:7.753 lr:1.0e-05 updt_s:0.448 data_s:0.004
These metrics are crucial for monitoring the training process and ensuring the model is learning effectively.
With the loss hovering around 0.055, the gradient norm steady near 7.6-7.8, and the learning rate down to 1e-5 after roughly 54 epochs, the run appears to have settled into a stable, convergent state.
Demonstration Video
This video shows the SO-ARM101 arm, controlled by our trained model, performing the task it learned from the collected dataset. Seeing the arm autonomously pick up the die and place it into the box, just as it was taught, is immensely satisfying.
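For reference, recent LeRobot versions run a trained policy on the real arm through the same record entry point, pointed at a checkpoint. A sketch with placeholder paths and repo id, to be verified against the wiki for your version:

    python -m lerobot.record \
        --robot.type=so101_follower \
        --robot.port=/dev/ttyACM0 \
        --dataset.repo_id=${HF_USER}/eval_so101_dice_in_box \
        --dataset.single_task="Put the die inside the box" \
        --policy.path=outputs/train/so101_dice/checkpoints/last/pretrained_model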
I hope this guide assists you on your own LeRobot journey with the SO-ARM101! The combination of accessible hardware like the SO-ARM101, powerful edge AI devices like the NVIDIA Jetson AGX Orin, and the intuitive LeRobot framework truly makes advanced robotics approachable for makers and enthusiasts alike.
Let me know in the comments if you have any questions.