This demonstration shows how to use the Tria Vision AI-KIT 6490 for robotics applications. More specifically, it shows how to use a Computer Vision based Hand Controller to control a wheeled vehicle robot.
In this project, we choose the LeKiwi mobile base.
Distributed ROS2 Application
The demo is a distributed ROS2 application, meaning it runs on two different hardware devices:
- Vision AI Kit (QCS6490, running QIRP v1.6)
- Host PC (laptop or workstation, running Ubuntu 24.04)
Both devices run ROS2 Jazzy and communicate with each other using ROS2's DDS middleware over Ethernet.
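For discovery to work, both devices must be configured with the same ROS_DOMAIN_ID (this demo uses domain 0 on both sides). A sketch of why this matters: the DDS-RTPS defaults derive the UDP discovery port from the domain ID, so mismatched IDs put the two devices on different ports and they never see each other's topics.

```python
# Sketch: DDS-RTPS default port mapping (PB = port base, DG = domain gain,
# d0 = discovery offset), as defined by the RTPS wire-protocol defaults.
PB, DG, D0 = 7400, 250, 0

def spdp_multicast_port(domain_id: int) -> int:
    """UDP port used for DDS participant discovery in a given domain."""
    return PB + DG * domain_id + D0

# Devices with different ROS_DOMAIN_ID values listen on different ports:
print(spdp_multicast_port(0))  # domain used in this demo -> 7400
print(spdp_multicast_port(1))  # -> 7650
```

This is also why unrelated ROS2 systems on the same network can be isolated from each other simply by giving them different domain IDs.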
The Host PC manages the "digital twin" simulation of the robot, in a virtual environment.
The Vision AI-KIT 6490 performs all of the CV-based hand controller functionality.
Images are captured on-device from a USB camera (Logitech), then processed locally using the Qualcomm AI Runtime (QAIRT). The models used are the "v0.10 full" versions of the MediaPipe palm detection and hand landmarks models. The following graph shows how the QCS6490's NPU was used to accelerate the models from 10 fps to 80 fps, leaving more CPU resources for the ROS2 application.
More details on how these models were accelerated can be found here:
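To put that speedup in perspective, simple arithmetic shows how much the per-frame compute budget shrinks:

```python
# Per-frame compute budget implied by the frame rates above.
def frame_budget_ms(fps: float) -> float:
    """Time available to process one frame at a given frame rate."""
    return 1000.0 / fps

print(frame_budget_ms(10))  # CPU-only pipeline -> 100.0 ms per frame
print(frame_budget_ms(80))  # NPU-accelerated   -> 12.5 ms per frame
```

At 80 fps the camera (capped at 30 fps for most USB webcams) is no longer the bottleneck, and the CPU is left mostly free for the ROS2 nodes.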
The position of the hands with respect to two dials generates the commands that control our mobile robot:
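The exact mapping from hand position to robot command is implemented inside the hand_controller package; the sketch below is purely illustrative (the function, dead-zone, and gain parameters are hypothetical, loosely in the spirit of the x_t/z_t threshold arguments passed to the launch file later in this article). One dial could drive linear velocity and the other angular velocity:

```python
def dial_to_velocity(angle: float, deadzone: float, gain: float,
                     v_max: float) -> float:
    """Map a normalized dial deflection in [-1, 1] to a velocity command.

    Hypothetical sketch: deflections inside the dead zone produce zero
    velocity; outside it, velocity scales linearly toward v_max.
    """
    if abs(angle) < deadzone:
        return 0.0
    v = gain * (abs(angle) - deadzone) * v_max
    return v if angle > 0 else -v

# e.g. one dial feeds linear.x, the other angular.z of a Twist message.
print(dial_to_velocity(0.1, deadzone=0.2, gain=1.0, v_max=0.5))  # -> 0.0
print(dial_to_velocity(1.0, deadzone=0.2, gain=1.0, v_max=0.5))  # ~0.4
```

The dead zone keeps the robot stationary when the hand hovers near a dial's center, which makes the controller far less twitchy in practice.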
The demo requires the following hardware components:
- Vision AI Kit 6490 (including power supply, heatsink)
- USB Webcam (Logitech HD Pro C920)
- DisplayPort monitor (or HDMI monitor with adapter)
- Host PC (laptop or workstation) running Ubuntu 24.04 (16 GB DDR recommended for Gazebo simulation)
- Ethernet router (or direct Ethernet connection)
- LeKiwi mobile base
The demo installation involves setup on both hardware components:
- Embedded Platform (QCS6490)
- Host PC
The next sections cover the installation steps.
0 - Set up LeKiwi Robot
If you purchased a LeKiwi robot kit (such as those sold by SeeedStudio), some assembly may be required.
- SeeedStudio-specific assembly instructions: https://wiki.seeedstudio.com/lerobot_lekiwi/#assembly
- General assembly instructions on GitHub (SIGRobotics-UIUC): https://github.com/SIGRobotics-UIUC/LeKiwi/blob/main/Assembly.md
There is also some software installation required. This is described on HuggingFace.
1. Install LeRobot on your PC: pip install -e ".[lekiwi]"
2. Find the port for your servo driver: lerobot-find-port
3. Run lerobot-setup-motors to set up the wheels' motor controllers
Before you run lerobot-setup-motors, you will need to modify the script to target only the LeKiwi base robot.
- Edit the lekiwi.py script to look for only the wheel motors and exclude the robot arm motors: (the lekiwi.py script may have a different path depending on where it was installed)
~/lerobot/src/lerobot/robots/lekiwi/lekiwi.py
motors={
# # arm
# "arm_shoulder_pan": Motor(1, "sts3215", norm_mode_body),
# "arm_shoulder_lift": Motor(2, "sts3215", norm_mode_body),
# "arm_elbow_flex": Motor(3, "sts3215", norm_mode_body),
# "arm_wrist_flex": Motor(4, "sts3215", norm_mode_body),
# "arm_wrist_roll": Motor(5, "sts3215", norm_mode_body),
# "arm_gripper": Motor(6, "sts3215", MotorNormMode.RANGE_0_100),
# base
"base_left_wheel": Motor(7, "sts3215", MotorNormMode.RANGE_M100_100),
"base_back_wheel": Motor(8, "sts3215", MotorNormMode.RANGE_M100_100),
"base_right_wheel": Motor(9, "sts3215", MotorNormMode.RANGE_M100_100),
},
After editing, run lerobot-setup-motors:
lerobot-setup-motors \
--robot.type=lekiwi \
--robot.port=/dev/ttyACM0 # <- paste the correct port here
Plug the servo driver into one motor at a time. The motors now have their IDs defined and are ready to use.
1 - Qualcomm Vision AI-KIT 6490 - Flash QIRP 1.6 image
Use the following Startup Guide to program the QIRP 1.6 image to the QCS6490 Vision AI Kit:
This provides instructions on how to program the latest version of the QIRP 1.6 image (visionai_6490_qirp_1.6_v4.zip):
After booting the Vision AI-KIT 6490 with the QIRP 1.6 image, you can perform a sanity check with the Out of Box demo:
2 - Qualcomm Vision AI-KIT 6490 - Install QAIRT SDK
First, make certain that changes on the Vision AI-KIT 6490 will be persistent:
mount -o remount,rw /usr
We need to install the QAIRT SDK on our board, which can be done using the following instructions.
First, we download and install version 2.40 of the QAIRT SDK:
export PRODUCT_SOC=6490 DSP_ARCH=68
wget https://softwarecenter.qualcomm.com/api/download/software/sdks/Qualcomm_AI_Runtime_Community/All/2.40.0.251030/v2.40.0.251030.zip
unzip v2.40.0.251030.zip
cd qairt/2.40.0.251030
source bin/envsetup.sh
export ADSP_LIBRARY_PATH=$QNN_SDK_ROOT/lib/hexagon-v${DSP_ARCH}/unsigned
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$QNN_SDK_ROOT/lib/aarch64-oe-linux-gcc11.2
Next, we want to clone, build, and install the QAI App Builder:
git clone --branch v2.40.0 https://github.com/quic/ai-engine-direct-helper --recursive
cd ai-engine-direct-helper
python3 -m pip install wheel
python3 setup.py bdist_wheel
pip3 install dist/qai_appbuilder-2.38.0-cp312-cp312-linux_aarch64.whl
3 - Qualcomm Vision AI-KIT 6490 - Install hand_controller
To install the robotics demo, open a command terminal and install the following additional packages:
mount -o remount,rw /usr
pip3 install flask
pip3 install edge_impulse_linux
Next, clone the latest version of the hand_controller repository:
cd /root
git clone https://github.com/Avnet/hand_controller --recursive
cd hand_controller
In addition to the cloned repository, we need to download the models that we plan on using, in this case the QAIRT models for QCS6490:
cd blaze_app_python/blaze_qairt/models
source ./get_qcs6490_models.sh
unzip blaze_qairt_models_qcs6490.zip
cd ../../..
To validate the installation of "blaze_app_python" and the QAIRT models for QCS6490, plug in a USB camera and run the blaze_app_python demo application as follows:
cd blaze_app_python
python3 blaze_detect_live.py --pipeline=qairt_hand_v0_10_full
You should see the following appear on the monitor's output:
You can press the 'y' key to activate the profile view, which will confirm that the v0.10 full versions of the MediaPipe models are running at over 80 fps.
Press the 'q' key to quit the demo application, and return to the /root/hand_controller directory:
cd ..
Now that the accelerated MediaPipe models are working with QAIRT, we want to set up the ROS2 package for the hand controller.
Configure the ROS2 environment:
export HOME=/home
export ROS_DOMAIN_ID=0
source /usr/bin/ros_setup.sh && source /usr/share/qirp-setup.sh
Build and install the hand_controller ROS2 package, located under the ros2_ws sub-directory:
cd ros2_ws
colcon build
source install/setup.bash
cd ..
To make the installation active on boot, add the following line to the .bashrc file:
echo "source /root/hand_controller/ros2_ws/hand_controller/install/setup.bash" >> ~/.bashrc
4 - Qualcomm Vision AI-KIT 6490 - Install lekiwi_ros2
To add support for the LeKiwi mobile base, install the following additional packages:
mount -o remount,rw /usr
pip3 install tqdm
pip3 install deepdiff
pip3 install feetech-servo-sdk
Next, clone the lekiwi_ros2 repository:
cd /root
git clone https://github.com/AlbertaBeef/lekiwi_ros2
cd lekiwi_ros2
Build and install the lekiwi_ros2 ROS2 package, located under the src sub-directory:
cd src
colcon build --packages-select lekiwi_hw_interface
source install/setup.bash
cd ..
Plug the LeKiwi mobile base's USB controller into the Vision AI-KIT 6490 and connect the 12V power supply, as illustrated below:
Ensure that the Vision AI-KIT 6490 can "see" the USB controller as follows:
root@qcs6490-visionai-kit:~/lekiwi_ros2/src# ls /dev/ttyACM*
/dev/ttyACM0
Manually launch the lekiwi_motor_bridge to validate that it is working, as follows:
root@qcs6490-visionai-kit:~/lekiwi_ros2/src# ros2 run lekiwi_hw_interface lekiwi_motor_bridge --ros-args -p verbose:=True
[INFO] [1769470535.011698374] [lekiwi_motor_bridge]: Verbose : "True"
[INFO] [1769470535.012614650] [lekiwi_motor_bridge]: Topic name : "/cmd_vel"
[INFO] [1769470535.013404315] [lekiwi_motor_bridge]: Use SO_101 : "False"
[INFO] [1769470535.014017630] [lekiwi_motor_bridge]: Connecting to Feetech bus on /dev/ttyACM0 …
[INFO] [1769470535.028800698] [lekiwi_motor_bridge]: Motor bus connected and configured.
[WARN] [1769470535.047059603] [lekiwi_motor_bridge]: Calibration file /root/lekiwi_ros2/src/install/lekiwi_hw_interface/share/lekiwi_hw_interface/config/lekiwi_calibration.yaml not found – will capture offsets on first read.
[INFO] [1769470535.047972181] [lekiwi_motor_bridge]: Done !
[INFO] [1769470535.069711977] [lekiwi_motor_bridge]: Captured home offsets: {'left_wheel_joint': 984, 'rear_wheel_joint': 738, 'right_wheel_joint': 1707}
In a separate command window, publish a Twist message to advance the LeKiwi mobile base as follows:
ros2 topic pub --once /cmd_vel geometry_msgs/msg/Twist "{linear: {x: 0.5, y: 0.0, z: 0.0}, angular: {x: 0.0, y: 0.0, z: 0.0}}"
The LeKiwi motors should turn, and the lekiwi_motor_bridge should display the following verbose output:
[INFO] [1769470540.846693732] [lekiwi_motor_bridge]: Base Twist : "geometry_msgs.msg.Twist(linear=geometry_msgs.msg.Vector3(x=0.5, y=0.0, z=0.0), angular=geometry_msgs.msg.Vector3(x=0.0, y=0.0, z=0.0))"
[INFO] [1769470540.852776214] [lekiwi_motor_bridge]: Base Action : "{'x.vel': 0.5, 'y.vel': 0.0, 'theta.vel': 0.0}"
[INFO] [1769470540.859177697] [lekiwi_motor_bridge]: Base Commands : "{'left_wheel_joint': -3000, 'rear_wheel_joint': 0, 'right_wheel_joint': 3000}"
To stop the LeKiwi motors, publish an all-zero Twist command, as follows:
ros2 topic pub --once /cmd_vel geometry_msgs/msg/Twist "{linear: {x: 0.0, y: 0.0, z: 0.0}, angular: {x: 0.0, y: 0.0, z: 0.0}}"
The LeKiwi motors should stop turning, and the lekiwi_motor_bridge should display the following verbose output:
[INFO] [1769470545.464267524] [lekiwi_motor_bridge]: Base Twist : "geometry_msgs.msg.Twist(linear=geometry_msgs.msg.Vector3(x=0.0, y=0.0, z=0.0), angular=geometry_msgs.msg.Vector3(x=0.0, y=0.0, z=0.0))"
[INFO] [1769470545.469800021] [lekiwi_motor_bridge]: Base Action : "{'x.vel': 0.0, 'y.vel': 0.0, 'theta.vel': 0.0}"
[INFO] [1769470545.474950185] [lekiwi_motor_bridge]: Base Commands : "{'left_wheel_joint': 0, 'rear_wheel_joint': 0, 'right_wheel_joint': 0}"
Press CTRL-C to exit the lekiwi_motor_bridge ROS2 node, and proceed with the remaining instructions.
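The wheel values in the verbose output above are consistent with the inverse kinematics of a three-wheel omnidirectional ("kiwi") base: each wheel receives the projection of the body twist onto its rolling direction. The sketch below is illustrative only — the wheel mount angles, base radius, and tick scale are assumptions for this example, not values taken from lekiwi_ros2:

```python
import math

# Assumed wheel mount angles for a kiwi base (degrees, robot frame,
# 0 deg = forward): front-left, rear, front-right.
WHEEL_ANGLES = {"left_wheel_joint": 60.0,
                "rear_wheel_joint": 180.0,
                "right_wheel_joint": 300.0}
BASE_RADIUS = 0.12    # assumed center-to-wheel distance (m)
TICKS_PER_MPS = 6928  # assumed conversion from m/s to servo ticks

def twist_to_wheels(vx: float, vy: float, wz: float) -> dict:
    """Project the body twist onto each wheel's rolling direction."""
    cmds = {}
    for name, deg in WHEEL_ANGLES.items():
        a = math.radians(deg)
        v = -math.sin(a) * vx + math.cos(a) * vy + BASE_RADIUS * wz
        cmds[name] = round(v * TICKS_PER_MPS)
    return cmds

# Pure forward motion: left/right wheels spin in opposite directions
# while the rear wheel stays still, matching the signs in the log above.
print(twist_to_wheels(0.5, 0.0, 0.0))
# -> {'left_wheel_joint': -3000, 'rear_wheel_joint': 0, 'right_wheel_joint': 3000}
```

This also explains why a sideways (y) or rotational (theta) command drives all three wheels at once, unlike a differential-drive base.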
To make the installation active on boot, add the following line to the .bashrc file:
echo "source /root/lekiwi_ros2/src/install/setup.bash" >> ~/.bashrc
5 - Qualcomm Vision AI-KIT 6490 - Launch the Robotics Demo (hand controller + motor bridge)
Finally, launch the hand controller with the following convenience script:
source /root/hand_controller/ros2_ws/launch_robotics_demo_lekiwi_qairt_dials.sh
This script starts by configuring the environment variables to use the monitor and graphical software stack:
export QMONITOR_BACKEND_LIB_PATH=/var/QualcommProfiler/libs/backends/
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/var/QualcommProfiler/libs/
export PATH=$PATH:/data/shared/QualcommProfiler/bins
export XDG_RUNTIME_DIR=/dev/socket/weston
export WAYLAND_DISPLAY=wayland-1
Then it configures the ROS2 environment:
export HOME=/home
export ROS_DOMAIN_ID=0
source /usr/bin/ros_setup.sh && source /usr/share/qirp-setup.sh
source /root/lekiwi_ros2/src/install/setup.bash
source /root/hand_controller/ros2_ws/install/setup.bash
Next comes the QAIRT-specific initialization:
export PRODUCT_SOC=6490 DSP_ARCH=68
source /root/qairt/2.40.0.251030/bin/envsetup.sh
export ADSP_LIBRARY_PATH=$QNN_SDK_ROOT/lib/hexagon-v${DSP_ARCH}/unsigned
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$QNN_SDK_ROOT/lib/aarch64-oe-linux-gcc11.2
Finally, the ROS2 application is launched as follows:
ros2 launch hand_controller demo13_lekiwi_part1_qai2dials.launch.py verbose:=False use_flask:=False use_imshow:=False x_t:=0.20 x_a:=0.0 x_b:=10.0 z_t:=0.20 z_a:=0.0 z_b:=2.0 threshold_detector_minscore:=0.6 threshold_landmark_confidence:=0.6 | ros2 run hand_controller gtk_gui_node
You will see the following user interface appear on the monitor.
Also running on the Vision AI-KIT 6490 is the motor bridge, which translates Twist commands into joint states for the LeKiwi mobile robot:
Here is the rqt_graph view of the ROS2 nodes that are launched on the Vision AI-KIT 6490:
6 - Host PC - Installation
First, the recommended specifications for the Host PC hardware are the following:
- 16 GB RAM
- Ubuntu 24.04
To prepare the Host PC, first ensure that it is up to date:
sudo apt update && sudo apt upgrade
sudo apt install git
Install Docker according to the instructions here:
The Docker installation can be validated with the "hello-world" Docker container:
$ docker run hello-world
Hello from Docker!
This message shows that your installation appears to be working correctly.
To generate this message, Docker took the following steps:
1. The Docker client contacted the Docker daemon.
2. The Docker daemon pulled the "hello-world" image from the Docker Hub.
(amd64)
3. The Docker daemon created a new container from that image which runs the
executable that produces the output you are currently reading.
4. The Docker daemon streamed that output to the Docker client, which sent it
to your terminal.
To try something more ambitious, you can run an Ubuntu container with:
$ docker run -it ubuntu bash
Share images, automate workflows, and more with a free Docker ID:
https://hub.docker.com/
For more examples and ideas, visit:
https://docs.docker.com/get-started/
With Docker successfully installed, clone the robotics_docker repository:
cd $HOME
git clone https://github.com/AlbertaBeef/robotics_docker
cd robotics_docker
Create a directory for the shared drive (persistent storage), or create a symbolic link to an existing directory you want to use for this purpose:
mkdir shared
This directory will be used by the Docker container for persistent storage. Anything you want to keep after your Docker session should be placed in this "shared" directory.
7 - Host PC - Launch Robotics Demo (robot simulation)
Finally, run the launcher script by double-clicking on its icon, or manually as follows:
~/robotics_docker/launcher/launch_docker_lekiwi.sh
This will pull the pre-built Docker container from Docker Hub (if not already done), then launch it with part 2 (robot simulation) of the robotics demo.
You will see four visualizations:
- rviz : visualization of the wheeled vehicle robot
- gazebo : visualization of the wheeled vehicle robot in a virtual environment
- hand_controller/annotations : visualization of hand controller results (from Vision AI-KIT 6490)
- lekiwi front camera : visualization from LeKiwi's front facing camera
The robot simulation includes a front-facing camera on the LeKiwi mobile base robot.
You will notice a Tria poster on one of the hangar doors. You can scan the QR code, which links to the following product page:
Putting It All Together
The following video describes how the three parts of the demo come together:
If the simulation is launched on the Host PC, without the Hand Controller running on the QCS6490 board, the RVIZ2 visualizer will return the following errors:
This is caused by RVIZ2 being unable to resolve its initial conditions, since no Twist messages have been published yet.
This does not prevent the Gazebo simulation from working, so it was left unresolved. If the hand controller is launched before the simulation, the issue does not occur.
Resources
Tria Technologies
Hugging Face
- LeRobot Installation : https://huggingface.co/docs/lerobot/installation
- SO-101 Arm Assembly : https://huggingface.co/docs/lerobot/so101#step-by-step-assembly-instructions
- LeKiwi Assembly : https://huggingface.co/docs/lerobot/lekiwi#step-by-step-assembly-instructions
Hand Controller (by Mario Bergeron):
- ROS2 package for LeKiwi : lekiwi_ros2
- Hand Controller (python version) : hand_controller
- Blaze Utility (python version) : blaze_app_python
QAIRT & QAI App Builder (by Radxa):
- QAIRT SDK Installation : https://docs.radxa.com/en/dragon/q6a/app-dev/npu-dev/qairt-install
- QAI APP Builder : https://docs.radxa.com/en/dragon/q6a/app-dev/npu-dev/qai-appbuilder
- 2026/02/02 - Initial Version (using QIRP 1.6 v4)