In the world of robotics and AI, the bridge between theoretical algorithms and the physical world—known as embodied intelligence—has often been complex and expensive to cross. The Hiwonder LeRobot SO-ARM101 changes that. It’s not just another robotic arm kit; it’s a fully open-source, dual-arm platform designed from the ground up for imitation learning, seamlessly integrated into the Hugging Face LeRobot ecosystem.
If you're holding this kit, you're holding a key to one of the most accessible gateways into real-world AI robotics. This guide will walk you through unboxing, setup, and running your first AI-powered task.
First, understand what you have. The SO-ARM101 is a leader-follower system: you physically guide the Leader Arm through a task while the Follower Arm mirrors it, and the recorded demonstrations are then used to train the Follower Arm to perform the task autonomously.
Key Hardware Components:
Leader & Follower Arm Structures: 3D-printed, highly optimized for smooth motion.
Servos: 12 custom high-torque (30 kg·cm) magnetic-encoder bus servos (6 per arm). Crucially, the Leader Arm’s first three joints use different gear ratios (1:191, 1:345, 1:191), which makes the arm easier to back-drive when you teach it by hand.
BusLinker V3.0 Debug Board: Your control hub for servo configuration and debugging.
Dual-Camera System: A gripper-mounted "eye-in-hand" camera for close-up detail and an external "third-person" camera for global scene awareness.
Power Supplies: You must use the correct voltage to avoid damage. (More on this critical point below).
System Requirements (Software):
- Operating System: Ubuntu 22.04 (x86) or Jetson Orin with JetPack 6.2.
- Core Software: Python 3.10, CUDA 12+, PyTorch 2.6.
- ⚠️ A Vital Note on Software Version: Hugging Face has rolled out a major LeRobot upgrade. For stability and compatibility with this guide, use the stable tutorial repository (maintained as of June 2025) for your initial setup, not the bleeding-edge main branch. This ensures all commands and dependencies work as described here.
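Once the software stack from Step 2 is installed, a quick sanity check like the one below (assuming PyTorch is already in your environment) confirms the versions match these requirements:
```python
# Quick sanity check of the core stack; run it inside the environment you
# create in Step 2. Assumes PyTorch has already been installed.
import sys

import torch

print("Python :", sys.version.split()[0])            # expect 3.10.x
print("PyTorch:", torch.__version__)                  # expect 2.6.x
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("CUDA runtime:", torch.version.cuda)        # expect 12.x
    print("GPU:", torch.cuda.get_device_name(0))
```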
Step 1: Hardware Assembly & The CRITICAL Power Rule
Assemble the arms using the provided guides. When connecting power, this is the single most important step:
- Leader Arm (all 7.4V servos): always use the 5V power supply.
- Follower Arm (12V servos in the Pro Kit): always use the 12V power supply.
Mixing these up will instantly burn out your servos, so label your cables and arms clearly.
Connect the BusLinker boards to your computer via USB (this is for data only; power comes separately).
Step 2: System Environment & Installing LeRobot
We'll use Miniconda to manage a clean Python environment.
```bash
# Install Miniconda (Ubuntu x86 example)
mkdir -p ~/miniconda3
wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh -O ~/miniconda3/miniconda.sh
bash ~/miniconda3/miniconda.sh -b -u -p ~/miniconda3
source ~/miniconda3/bin/activate
conda init

# Create and activate the LeRobot environment
conda create -y -n lerobot python=3.10
conda activate lerobot

# Clone the STABLE tutorial repository (critical!)
git clone https://github.com/huggingface/lerobot-tutorial-stable.git ~/lerobot
cd ~/lerobot

# Install LeRobot with Feetech servo support
pip install -e ".[feetech]"
```
Step 3: Motor Configuration & Calibration
First, find which USB port corresponds to each arm. With one arm powered and connected, run:
```bash
python lerobot/scripts/find_motors_bus_port.py
```
Note the port (e.g., /dev/ttyACM0), then repeat for the other arm.
Calibration tells the software the physical limits of each joint. The process is interactive in the terminal. Because the Leader Arm's joints 1-3 have unique gear ratios, pay close attention to on-screen prompts to calibrate them correctly. The script will move each joint to its limits and ask you to confirm.
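For intuition about what calibration actually produces, here is a conceptual sketch; it is not LeRobot's calibration script, and the read_encoder helper and the [-100, 100] normalized range are assumptions made for illustration:
```python
# Conceptual sketch only, not LeRobot's calibration code.
# read_encoder() is a hypothetical stand-in for reading one servo's raw position.

def record_joint_limits(read_encoder, prompt=input):
    """Capture the raw encoder value at each mechanical limit of one joint."""
    prompt("With the joint at its MINIMUM limit, press Enter...")
    low = read_encoder()
    prompt("With the joint at its MAXIMUM limit, press Enter...")
    high = read_encoder()
    return low, high

def normalize(raw, low, high):
    """Map a raw encoder count linearly onto an assumed [-100, 100] range."""
    span = max(high - low, 1)
    return (raw - low) / span * 200.0 - 100.0
```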
The Magic: Your First Imitation Learning Project
With setup complete, let’s make the robot learn a simple pick-and-place task.
Phase 1: Data Collection – Teaching by Doing
Concept: You will use the Leader Arm to perform the task 20-30 times. The system records everything.
Process:
1. Position the target object (e.g., a colored cube) in the workspace.
2. Run the data collection script:
python scripts/collect_demonstrations.py --task pick_and_place
3. Physically grab and guide the Leader Arm through the complete task: approach, grasp, lift, move, release.
4. The system synchronously records (one recorded timestep is sketched below):
   - All servo joint angles.
   - Video from the gripper camera (detailed view).
   - Video from the external camera (contextual view).
5. Repeat for multiple demonstrations. More data = a better model.
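To make the recording step concrete, one timestep in the demonstration data might look roughly like the dictionary below; the exact keys and image resolutions in LeRobot's dataset format will differ, so treat these field names as illustrative assumptions:
```python
import numpy as np

# Illustrative only: the real LeRobot dataset schema differs in detail.
timestep = {
    "timestamp": 12.53,                                  # seconds since the episode started
    "observation.state": np.zeros(6, dtype=np.float32),  # 6 follower joint angles
    "action": np.zeros(6, dtype=np.float32),             # 6 leader joint targets
    "observation.images.gripper": np.zeros((480, 640, 3), dtype=np.uint8),  # eye-in-hand view
    "observation.images.scene": np.zeros((480, 640, 3), dtype=np.uint8),    # third-person view
}
```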
Phase 2: Model Training – Creating the "Brain"
Concept: The LeRobot framework uses your demonstration data to train an Action Chunking Transformer (ACT) model, a state-of-the-art imitation learning algorithm.
Process:
1. The data is automatically formatted into a dataset.
2. Launch training with a command like:
```bash
python train.py policy=act_cnn dataset=pick_place_demo
```
3. Training runs on your GPU. You can monitor the loss, which should decrease over time, meaning the model is learning to replicate your actions. Thanks to the Hugging Face integration, you can even start from a shared pre-trained model to speed this up.
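A note on what "Action Chunking" means in practice: instead of predicting one action per observation, ACT predicts a short sequence (a chunk) of future joint targets at once. The sketch below shows that idea with hypothetical stand-ins (policy.predict, get_observation, send_joint_targets) and an assumed chunk size; the ACT paper additionally describes temporal ensembling to smooth execution.
```python
# Conceptual sketch of action chunking at inference time (hypothetical API).
CHUNK_SIZE = 50  # ACT typically predicts tens of future steps per query (assumed value)

def run_chunked_control(policy, get_observation, send_joint_targets, n_steps=1000):
    """Query the policy for a chunk of actions, then execute them one by one."""
    step = 0
    while step < n_steps:
        obs = get_observation()             # camera frames + current joint states
        action_chunk = policy.predict(obs)  # shape: (CHUNK_SIZE, n_joints)
        for action in action_chunk:
            send_joint_targets(action)      # command the follower arm's servos
            step += 1
            if step >= n_steps:
                break
```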
Phase 3: Deployment – Autonomous Execution
Concept: Deploy the trained model to the Follower Arm for live inference.
The Moment of Truth:
1. Run the deployment script:
python scripts/eval_policy.py --checkpoint ./outputs/your_model.ckpt
2. The Follower Arm will spring to life. Using only its dual-camera vision to locate the object and its trained model to decide actions, it will attempt to execute the pick-and-place task.
3. It won't be perfect on the first try, but this is embodied AI in action: a physical system perceiving, deciding, and acting.
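A practical pre-flight tip: before the first autonomous run, confirm that both cameras actually deliver frames. A minimal OpenCV check might look like this; the device indices are assumptions and will likely differ on your machine:
```python
import cv2

# Verify that both cameras open and return frames.
# Indices 0 and 2 are placeholders; adjust them for your setup.
for name, index in [("gripper (eye-in-hand)", 0), ("scene (third-person)", 2)]:
    cap = cv2.VideoCapture(index)
    ok, frame = cap.read()
    status = f"{frame.shape[1]}x{frame.shape[0]}" if ok else "NO FRAME"
    print(f"{name:24s} /dev/video{index}: {status}")
    cap.release()
```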
Tips & Troubleshooting:
- Use the BusLinker Software: The BusLinker V3.0 GUI is excellent for real-time servo monitoring, testing individual movements, and troubleshooting connection issues.
Common Pitfalls:
- "Servos not found": Check USB permissions on Linux (
sudo chmod 666 /dev/ttyACM0), and double-check power/data cable connections. - Jittery motion: Ensure the mechanical structure is fully tightened and that you're using the correct, stable power supply.
- Training errors: Verify your Conda environment (
lerobot) is active and all dependencies installed.
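For the "servos not found" case, a quick way to confirm the BusLinker board is visible to the OS at all is to list serial devices with pyserial (install it with pip install pyserial if it isn't already present):
```python
# List every serial device the OS can see; the BusLinker board should appear
# as something like /dev/ttyACM0 or /dev/ttyUSB0.
from serial.tools import list_ports

for port in list_ports.comports():
    print(f"{port.device:15s} {port.description}")
```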
Where to Go From Here:
- Experiment: Try different tasks (stacking, pushing, drawing).
- Explore Hugging Face Hub: Download community datasets and models to try immediately.
- Dive Deeper: Modify the ACT model architecture, or experiment with reinforcement learning after initial imitation learning.
- Contribute: Share your own datasets and trained models back to the Hugging Face community to help others.
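If you want to share a recorded dataset, the sketch below uses the generic huggingface_hub client to push a local folder to the Hub; the repo name and folder path are placeholders, and LeRobot may also provide its own dataset-publishing tooling that you prefer instead.
```python
# Minimal sketch: publish a local dataset folder to the Hugging Face Hub.
# Requires `pip install huggingface_hub` and a prior `huggingface-cli login`.
from huggingface_hub import HfApi

api = HfApi()
repo_id = "your-username/so-arm101-pick-place"   # placeholder repo name
api.create_repo(repo_id=repo_id, repo_type="dataset", exist_ok=True)
api.upload_folder(
    folder_path="data/pick_place_demo",          # placeholder local path
    repo_id=repo_id,
    repo_type="dataset",
)
```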
The Hiwonder SO-ARM101 demystifies embodied AI. In one weekend, you can go from unboxing to having a robot that learns from you. Its true power lies in its open-source philosophy and deep Hugging Face integration, connecting you directly to a global community of innovators.
This isn't just about building a robot; it's about building the future, one demonstration at a time. Now, go and teach it something amazing. What will you teach your SO-ARM101 first? Share your projects and questions in the comments below!