This demonstration shows how to use the Tria Vision AI-KIT 6490 for robotics applications. More specifically, it shows how to use a Computer Vision based Hand Controller to control the LeRobot robotic arm.
In the previous demo, we used a hand controller to control a wheeled vehicle in simulation. This time, we will control a real robotic arm, in addition to the simulated robotic arm.
Distributed ROS2 Application
The demo is a distributed ROS2 application. This means that the demo runs on two different hardware devices:
- Vision AI Kit (QCS6490, running QIRP v1.6)
- Host PC (laptop or workstation, running Ubuntu 24.04)
Both devices run ROS2 Jazzy and interact with each other using ROS2's DDS protocol over Ethernet.
The Host PC can optionally be used to perform a simulation of the robotic arm using Gazebo.
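Since the two devices discover each other automatically over DDS, a quick sanity check is to list the topics visible from one device while the other is running. Below is a minimal sketch of such a check using rclpy; it assumes both devices share the same ROS_DOMAIN_ID and are on the same network (which topics appear will depend on which nodes are running).

# dds_discovery_check.py : minimal sketch, run on either device while the other is up
import time
import rclpy

def main():
    rclpy.init()
    node = rclpy.create_node('dds_discovery_check')
    time.sleep(2.0)  # give DDS discovery a moment to complete
    for name, types in node.get_topic_names_and_types():
        print(f'{name}: {types}')
    node.destroy_node()
    rclpy.shutdown()

if __name__ == '__main__':
    main()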
The Vision AI-KIT 6490 performs all of the CV-based hand controller functionality.
Images are captured from a USB camera (Logitech) on-device, then processed using MediaPipe palm detection and hand landmarks.
The hand landmarks are then fed into a PointNet (3D detection) model to recognize American Sign Language (ASL).
Finally, the detected signs are used to generate ROS2 commands that control 4 of the 6 Degrees of Freedom (DoF) on the LeRobot SO-101 robotic arm.
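To give a rough idea of how that last step could look, here is a hedged sketch of a node that turns recognized signs into joint commands. The actual topic names, message types, and sign-to-joint mapping are defined in the hand_controller package; the /asl_sign and /so101/joint_commands names below are assumptions for illustration only.

# sign_to_joints.py : illustrative sketch only, not the hand_controller implementation
import rclpy
from rclpy.node import Node
from std_msgs.msg import String
from sensor_msgs.msg import JointState

class SignToJoints(Node):
    def __init__(self):
        super().__init__('sign_to_joints')
        # Assumed topics; the real names live in the hand_controller package
        self.sub = self.create_subscription(String, '/asl_sign', self.on_sign, 10)
        self.pub = self.create_publisher(JointState, '/so101/joint_commands', 10)
        # 4 of the 6 DoF are driven by the hand controller
        self.names = ['shoulder_pan', 'shoulder_lift', 'elbow_flex', 'wrist_flex']
        self.positions = [0.0, 0.0, 0.0, 0.0]

    def on_sign(self, msg):
        # Example mapping: nudge one joint per recognized sign
        step = {'A': (0, 0.05), 'B': (0, -0.05), 'C': (1, 0.05), 'D': (1, -0.05)}
        if msg.data in step:
            idx, delta = step[msg.data]
            self.positions[idx] += delta
        cmd = JointState()
        cmd.header.stamp = self.get_clock().now().to_msg()
        cmd.name = self.names
        cmd.position = self.positions
        self.pub.publish(cmd)

def main():
    rclpy.init()
    rclpy.spin(SignToJoints())

if __name__ == '__main__':
    main()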
The hardware setup for the LeRobot integration with the Tria Vision AI-KIT 6490 is fairly straightforward, since all peripherals connect over USB.
You will want to make use of the 4 USB ports on the Vision AI-KIT 6490:
- USB keyboard
- USB mouse
- USB camera (Logitech)
- USB motor interface for LeKiwi + SO-101 Follower Arm
The demo installation involves setup on both hardware components:
- Embedded Platform (QCS6490)
- Host PC
The next sections will cover the installation steps.
1 - Qualcomm Vision AI-KIT 6490 - Flash QIRP 1.6 image
Use the following Startup Guide to program the QIRP 1.6 image to the QCS6490 Vision AI Kit:
This provides instructions on how to program the latest version of the QIRP 1.6 image (visionai_6490_qirp_1.6_v4.zip):
After booting the Vision AI-KIT 6490 with the QIRP 1.6 image, you can perform a sanity check with the Out of Box demo:
2 - Qualcomm Vision AI-KIT 6490 - Install hand_controller
In order to install the robotics demo, open a command terminal and ensure the /usr directory is writable:
mount -o remount,rw /usr
Next, clone the latest version of the hand_controller repository:
cd /root
git clone https://github.com/AlbertaBeef/hand_controller
cd hand_controller
Configure the ROS2 environment:
export HOME=/home
export ROS_DOMAIN_ID=0
source /usr/bin/ros_setup.sh && source /usr/share/qirp-setup.sh
Build and install the hand_controller ROS2 package, located under the ros2_ws sub-directory:
cd ros2_ws
colcon build
source install/setup.bash
cd ..
To make the installation active on boot, add the following line to the .bashrc file:
echo "source /root/hand_controller/ros2_ws/hand_controller/install/setup.bash >> ~/.bashrc3 - Qualcomm Vision AI-KIT 6490 - Launch the Robotics Demo part 1 (hand controller)Launch the hand controller portion of the ROS2 demo as follows:
ros2 launch hand_controller demo22_lerobot_part1_asl.launch.py use_imshow:=True
You will see the following User Interface appear on the monitor.
4.1 - Qualcomm Vision AI-KIT 6490 - Install LeRobot
Refer to the following project to install the LeRobot package on the Vision AI-KIT 6490:
- https://www.hackster.io/AlbertaBeef/lerobot-getting-started-guide-for-tria-vision-ai-kit-6490-0ec4f5
The SO-101 Follower Arm will need to be calibrated as well, but this will be done in the next section, using the ROS2 package.
4.2 - Qualcomm Vision AI-KIT 6490 - Install LeRobot ROS2 package
Start by creating a clone of the "lerobot" conda environment we created in the previous section. This is recommended since installing the ROS2 package can prevent some of the LeRobot utilities from working; keeping the original environment intact means they remain available if needed.
conda deactivate
conda create --name lerobot_ros --clone lerobot
conda activate lerobot_ros
Install a new version of the numpy package as follows:
pip3 install numpy
Now clone the "Lerobot_ros2" repository, build the "so101_hw_interface" package, and install as follows:
git clone https://github.com/AgRoboticsResearch/Lerobot_ros2
cd Lerobot_ros2
cd src
colcon build --packages-select so101_hw_interface
source install/setup.bash
Calibrate the SO-101 Follower Arm as follows:
ros2 run so101_hw_interface so101_calibrate
The calibration data will be stored in the following location:
/opt/.so101_follower_calibration.yaml
You will need to copy the contents of this calibration data to the following location:
Lerobot_ros2/src/so101_hw_interface/config/so101_calibration.yaml
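Before copying the values over, it can be worth sanity-checking the generated file. The sketch below simply loads the YAML and flags any joint whose range looks inverted; it assumes the structure shown further below (a top-level 'generated' timestamp plus one mapping per joint).

# check_calibration.py : hedged sanity check of the generated calibration file
import yaml

SRC = '/opt/.so101_follower_calibration.yaml'

with open(SRC) as f:
    calib = yaml.safe_load(f)

for joint, values in calib.items():
    if joint == 'generated':
        continue
    if values['range_min'] >= values['range_max']:
        print(f'WARNING: {joint} has an inverted range: {values}')
    else:
        print(f'{joint}: ok ({values["range_min"]}..{values["range_max"]})')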
In my case, I was not able to move the shoulder pan joint during calibration, so I did not update this portion, but updated all the other joints.
The most important was the wrist joint, which would otherwise initialize in an incorrect position.
Here is what my final calibration data looks like:
generated: '2025-06-19T14:32:38.435515'
shoulder_pan:
  range_min: 745
  range_max: 3424
  homing_offset: 2084
shoulder_lift:
  range_min: 904
  range_max: 3258
  homing_offset: 2081
elbow_flex:
  range_min: 870
  range_max: 3058
  homing_offset: 1964
wrist_flex:
  range_min: 823
  range_max: 3123
  homing_offset: 1973
wrist_roll:
  range_min: 146
  range_max: 3966
  homing_offset: 2056
gripper:
  range_min: 2053
  range_max: 3524
  homing_offset: 2788
If you prefer to use my modified calibration data, you can instead clone my fork of this repo:
git clone https://github.com/AlbertaBeef/Lerobot_ros2
We are now ready to activate the motor bridge!
5 - Qualcomm Vision AI-KIT 6490 - Launch the Robotics Demo Part 2 (motor bridge)
This section will launch the motor bridge, responsible for sending the joint states to the real SO-101 robotic arm.
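Before driving the real arm, it can be useful to confirm that joint-state messages are actually reaching the bridge. The sketch below simply subscribes and prints them; the /joint_states topic name is an assumption here, so check the so101_hw_interface package for the topic actually used by the bridge.

# joint_state_echo.py : hedged sketch to confirm joint states are flowing
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import JointState

class JointStateEcho(Node):
    def __init__(self):
        super().__init__('joint_state_echo')
        # Assumed topic; verify against the so101_hw_interface package
        self.create_subscription(JointState, '/joint_states', self.on_msg, 10)

    def on_msg(self, msg):
        print(dict(zip(msg.name, msg.position)))

def main():
    rclpy.init()
    rclpy.spin(JointStateEcho())

if __name__ == '__main__':
    main()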
NOTE: BE CAREFUL ABOUT YOUR ROBOTIC ARM'S POSITION. IT WILL SUDDENLY MOVE TO THE POSITION IDENTIFIED IN THE VIDEO BELOW.
Launch the motor_bridge as follows:
ros2 run so101_hw_interface so101_motor_bridge
You should see the following occur with your robotic arm:
First, the recommended specifications for the Host PC hardware are the following:
- 16 GB RAM
- Ubuntu 24.04
In order to install the demo software on the Host PC, first ensure that it is up to date:
sudo apt update && sudo apt upgrade
sudo apt install git
Install Docker according to the instructions here:
The docker installation can be validated with the "hello-world" docker container:
$ docker run hello-world
Hello from Docker!
This message shows that your installation appears to be working correctly.
To generate this message, Docker took the following steps:
1. The Docker client contacted the Docker daemon.
2. The Docker daemon pulled the "hello-world" image from the Docker Hub.
(amd64)
3. The Docker daemon created a new container from that image which runs the
executable that produces the output you are currently reading.
4. The Docker daemon streamed that output to the Docker client, which sent it
to your terminal.
To try something more ambitious, you can run an Ubuntu container with:
$ docker run -it ubuntu bash
Share images, automate workflows, and more with a free Docker ID:
https://hub.docker.com/
For more examples and ideas, visit:
https://docs.docker.com/get-started/
With Docker successfully installed, clone the robotics docker repository:
cd $HOME
git clone https://github.com/AlbertaBeef/robotics_docker
cd robotics_docker
Create a directory for the shared drive (persistent storage), or create a symbolic link to an existing directory you want to use for this purpose:
mkdir shared
Then navigate to the launcher sub-directory, and install the launch script:
cd launcher
source ./install_launcher.sh
This will create a launch_docker.sh script in your $HOME folder, as well as a launch icon on your desktop.
7.1 - Host PC - Installing LeRobot
LeRobot must be installed on the Host PC as well.
Follow the installation instructions from Hugging Face:
If connecting the LeRobot hardware to the Host PC (not the case for this project), you may need to adjust the permissions for the USB ports of the motor controller, as follows:
sudo usermod -a -G dialout $USER
If this does not work, you may also need to manually adjust the permissions as follows:
sudo chmod a+rw /dev/ttyACM0
sudo chmod a+rw /dev/ttyACM1
7.2 - Host PC - Installing LeRobot ROS2 package
Start by creating a clone of the "lerobot" conda environment we created in the previous section. This is recommended since installing the ROS2 package can prevent some of the LeRobot utilities from working; keeping the original environment intact means they remain available if needed.
conda deactivate
conda create --name lerobot_ros --clone lerobot
conda activate lerobot_ros
Now clone the "Lerobot_ros2" repository, build the "so101_follower_description" package, and install as follows:
git clone https://github.com/AgRoboticsResearch/Lerobot_ros2
cd Lerobot_ros2
cd src
colcon build --packages-select so101_follower_description
source install/setup.bash
8 - Host PC - Launch Robotics Demo Part 3 (robot simulation)
Part 3 of the demo can be launched as follows:
ros2 launch so101_follower_description display.launch.py
You will see the following visualization:
- rviz : visualization of the SO-101 Robotic Arm
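If you want to see the simulated arm move without the rest of the pipeline, a simple test is to publish a slow sinusoidal motion on the joint states that RViz visualizes. The sketch below assumes display.launch.py runs robot_state_publisher listening on /joint_states and that the joint names match the so101_follower_description URDF; verify both (and stop any joint_state_publisher GUI that may already be publishing) before running it.

# sim_joint_wave.py : hedged sketch to animate the simulated arm in RViz
import math
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import JointState

# Assumed joint names; check the so101_follower_description URDF
JOINTS = ['shoulder_pan', 'shoulder_lift', 'elbow_flex',
          'wrist_flex', 'wrist_roll', 'gripper']

class SimJointWave(Node):
    def __init__(self):
        super().__init__('sim_joint_wave')
        self.pub = self.create_publisher(JointState, '/joint_states', 10)
        self.t = 0.0
        self.create_timer(0.05, self.tick)  # 20 Hz

    def tick(self):
        self.t += 0.05
        msg = JointState()
        msg.header.stamp = self.get_clock().now().to_msg()
        msg.name = JOINTS
        msg.position = [0.3 * math.sin(self.t)] * len(JOINTS)
        self.pub.publish(msg)

def main():
    rclpy.init()
    rclpy.spin(SimJointWave())

if __name__ == '__main__':
    main()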
The following video describes how the three parts of the demo come together:
Resources
Tria Technologies
Hugging Face
- LeRobot Installation : https://huggingface.co/docs/lerobot/installation
- SO-101 Arm Assembly : https://huggingface.co/docs/lerobot/so101#step-by-step-assembly-instructions
- LeKiwi Assembly : https://huggingface.co/docs/lerobot/lekiwi#step-by-step-assembly-instructions
ASL Recognition using PointNet (by Edward Roe):
- Medium Article : ASL Recognition using PointNet and MediaPipe
- Kaggle Dataset : American Sign Language Dataset
- GitHub Source : pointnet_hands
Hand Controller (by Mario Bergeron):
- Hand Controller (python version) : hand_controller
- Blaze Utility (python version) : blaze_app_python
2025/12/22 - Initial Version (using QIRP 1.6 v4)






