This project provides a getting started guide for LeRobot with the Tria Vision AI-KIT 6490.
LeRobot Overview
LeRobot is an exciting step forward in democratizing robotics, driven by Hugging Face’s mission to make advanced AI accessible to everyone. By bringing the same open, collaborative spirit that transformed NLP and computer vision into the robotics world, LeRobot lowers the barrier to working with learning-based, real-world robots.
With a clean Python API, standardized robot interfaces, and ready-to-use datasets for imitation and reinforcement learning, LeRobot makes it possible to move from data collection to trained robotic policies with unprecedented ease. Built around reproducibility, openness, and hardware-agnostic design, it bridges modern ML tooling (PyTorch, Transformers) with physical robots—turning embodied AI from a niche research topic into something the broader community can truly build, share, and advance together 🚀
We will be using two of the LeRobot open-source hardware projects:
- SO-101 Follower Arm + SO-101 Leader Arm
- LeKiwi Mobile Base
There is a lot of documentation on how to assemble the LeKiwi mobile base and SO-101 arms.
- https://huggingface.co/docs/lerobot/so101#step-by-step-assembly-instructions
- https://huggingface.co/docs/lerobot/lekiwi#step-by-step-assembly-instructions
Normally, I would have purchased from Seeed Studio, but for some reason I could not place an order for shipment to Canada.
I purchased my pair of SO-101 robotic arms, pre-assembled, from WowRobo.
I still had to swap the base of the follower arm for the base that fits onto the LeKiwi mobile base.
Here is a picture of my fully-assembled LeKiwi + SO-101 Robotic Arm:
The Vision AI-KIT 6490 is an advanced edge-AI development platform from Tria Technologies, built around a Qualcomm Dragonwing™ QCS6490 SMARC compute module.
It’s designed to support vision-centric AI applications and multi-camera processing in industrial, robotics, and embedded systems.
Prior to installing the LeRobot Python API, we need to program the QIRP 1.6 v4 image on the Tria Vision AI-KIT 6490.
Use the following Startup Guide to program the QIRP 1.6 image to the QCS6490 Vision AI Kit:
The guide provides instructions on how to program the latest version of the QIRP 1.6 image (visionai_6490_qirp_1.6_v4.zip):
After booting the Vision AI-KIT 6490 with the QIRP 1.6 image, you can perform a sanity check with the Out of Box demo:
Close the OOB demo by clicking on the "Exit" button.
Setup LeRobot on the Tria Vision AI-KIT 6490
The hardware setup for the LeRobot integration with the Tria Vision AI-KIT 6490 is fairly straightforward, since all peripherals connect over USB.
I did hit the limit on the number of USB ports, so I had to use a USB hub to connect the following peripherals to the Vision AI-KIT 6490:
- USB keyboard
- USB mouse
- USB motor interface for LeKiwi + SO-101 Follower Arm
- USB motor interface for SO-101 Leader
- USB camera (front facing)
- USB camera (wrist mounted)
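To confirm that everything behind the hub actually enumerated, here is a small stdlib-only Python sketch (Linux-only; the helper name is my own, not part of LeRobot) that reads the product strings the kernel exposes under sysfs:

```python
from pathlib import Path

def list_usb_devices(root: str = "/sys/bus/usb/devices") -> list[str]:
    """Return the product strings of all enumerated USB devices (Linux)."""
    base = Path(root)
    if not base.is_dir():
        return []
    names = []
    for dev in sorted(base.glob("*")):
        product = dev / "product"  # only present for real devices, not hubs' root ports
        if product.is_file():
            names.append(product.read_text().strip())
    return names

if __name__ == "__main__":
    for name in list_usb_devices():
        print(name)
```

Running this on the kit should print one line per peripheral (keyboard, mouse, two motor interfaces, two cameras); anything missing points at a hub or cabling problem.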
The following instructions are nearly identical to the official Hugging Face instructions, with a few exceptions:
On the Vision AI-KIT 6490's desktop, click on the "Terminal" icon on the top left to launch a command window.
Start by installing Conda (Miniforge):
wget "https://github.com/conda-forge/miniforge/releases/latest/download/Miniforge3-$(uname)-$(uname -m).sh"
bash Miniforge3-$(uname)-$(uname -m).sh
Create and activate a conda environment for use with LeRobot:
conda create -y -n lerobot python=3.12
conda activate lerobot
conda install ffmpeg -c conda-forge
Finally, clone and install the LeRobot Python API:
git clone https://github.com/huggingface/lerobot.git
cd lerobot
pip3 install -e .
pip3 install -e ".[feetech]"
pip3 install -e ".[lekiwi]"
You are all set to use LeRobot on the Vision AI-KIT 6490!
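As a quick smoke test of the install (a sketch of my own, not an official LeRobot check), you can verify from inside the lerobot conda environment that the key packages resolve. I am assuming here that "cv2" (OpenCV) is pulled in as a LeRobot dependency for camera support:

```python
import importlib.util

def is_importable(pkg: str) -> bool:
    """True if the package can be found on the current Python path."""
    return importlib.util.find_spec(pkg) is not None

# "lerobot" comes from the editable pip install above.
for pkg in ("lerobot", "cv2"):
    print(f"{pkg}: {'OK' if is_importable(pkg) else 'missing'}")
```

If either package reports "missing", re-run the pip3 install steps with the conda environment activated.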
Calibrate the Follower and Leader Arms
Before using the SO-101 robotic arms, they must be calibrated.
The calibration process is well documented by Hugging Face:
On the Tria Vision AI-KIT 6490, the SO-101 robotic arms will be mapped to the following USB controllers:
- /dev/ttyACM0 : SO-101 Follower Arm (plugged in first)
- /dev/ttyACM1 : SO-101 Leader Arm (plugged in second)
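Because the ttyACM numbering depends on plug-in order, it is worth double-checking which port is which before calibrating. A small stdlib-only sketch (the helper is hypothetical, not part of LeRobot) resolves the stable /dev/serial/by-id symlinks that Linux creates for each USB serial adapter:

```python
from pathlib import Path

def serial_port_map(by_id: str = "/dev/serial/by-id") -> dict[str, str]:
    """Map each stable by-id link to the /dev/ttyACM* node it points at."""
    base = Path(by_id)
    if not base.is_dir():
        return {}
    return {link.name: str(link.resolve()) for link in sorted(base.glob("*"))}

if __name__ == "__main__":
    for name, node in serial_port_map().items():
        print(f"{node} <- {name}")
```

The by-id names encode the USB adapter's vendor/product strings and serial number, so they stay the same across reboots even if the ttyACM numbers shuffle.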
To calibrate the SO-101 Follower Arm, use the following command:
lerobot-calibrate \
    --robot.type=so101_follower --robot.port=/dev/ttyACM0 --robot.id=qcs6490_follower_arm
To calibrate the SO-101 Leader Arm, use the following command:
lerobot-calibrate \
    --teleop.type=so101_leader --teleop.port=/dev/ttyACM1 --teleop.id=qcs6490_leader_arm
Tele-Operate the Robotic Arm
With the calibration of the robotic arms done, we can now perform a tele-operation session, where the leader arm controls the follower arm:
lerobot-teleoperate \
    --robot.type=so101_follower --robot.port=/dev/ttyACM0 --robot.id=qcs6490_follower_arm \
    --teleop.type=so101_leader --teleop.port=/dev/ttyACM1 --teleop.id=qcs6490_leader_arm \
    --display_data=True
We can also perform a tele-operation session with the cameras:
lerobot-teleoperate \
    --robot.type=so101_follower --robot.port=/dev/ttyACM0 --robot.id=qcs6490_follower_arm \
    --teleop.type=so101_leader --teleop.port=/dev/ttyACM1 --teleop.id=qcs6490_leader_arm \
    --display_data=True \
    --robot.cameras="{
        front: {type: opencv, index_or_path: 2, width: 640, height: 480, fps: 30},
        wrist: {type: opencv, index_or_path: 4, width: 640, height: 480, fps: 30}
    }"
The "index_or_path" values will depend on how the USB cameras enumerate on the Vision AI-KIT 6490. On my setup, the cameras enumerated as:
- Front Camera (plugged in first) : /dev/video2 & /dev/video3
- Wrist Camera (plugged in second) : /dev/video4 & /dev/video5
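The mapping from physical camera to video node can be confirmed by reading the driver-reported names under /sys/class/video4linux. This is again a Linux-only, stdlib-only sketch of my own, not a LeRobot utility:

```python
from pathlib import Path

def video_device_names(sys_root: str = "/sys/class/video4linux") -> dict[str, str]:
    """Map each /dev/video* node to its driver-reported device name."""
    base = Path(sys_root)
    if not base.is_dir():
        return {}
    nodes = {}
    for entry in sorted(base.glob("video*")):
        name_file = entry / "name"  # e.g. the camera's USB product string
        name = name_file.read_text().strip() if name_file.is_file() else "?"
        nodes[f"/dev/{entry.name}"] = name
    return nodes

if __name__ == "__main__":
    for node, name in video_device_names().items():
        print(f"{node}: {name}")
```

Each USB camera typically exposes two nodes (capture plus metadata), which is why the front camera above claims both /dev/video2 and /dev/video3; use the lower-numbered node of each pair for "index_or_path".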
Here is a video of the tele-operation session with cameras.
What Next?
The next steps would be to capture and record several tele-operation sessions for a particular task. For this dataset recording, the lerobot-record utility would be used:
This captured dataset would then be used to train a VLA (Vision-Language-Action) model to obtain a policy that can be run locally on the embedded hardware.
Conclusion
In this project, I described how to set up the LeRobot hardware to work with the Tria Vision AI-KIT 6490.
In the next project, I will explore how to integrate this into ROS2.
References
Hugging Face
- LeRobot Installation : https://huggingface.co/docs/lerobot/installation
- SO-101 Arm Assembly : https://huggingface.co/docs/lerobot/so101#step-by-step-assembly-instructions
- LeKiwi Assembly : https://huggingface.co/docs/lerobot/lekiwi#step-by-step-assembly-instructions
- SO-101 Arm Calibration : https://huggingface.co/docs/lerobot/so101#calibrate
2025/12/15 - Initial Version (using QIRP 1.6 v4)