This demonstration shows how to use the Tria Vision AI-KIT 6490 for robotics applications. More specifically, it shows how to use a Computer Vision-based Hand Controller to control a wheeled vehicle robot.
Distributed ROS2 Application
The demo is a distributed ROS2 application. This means that the demo runs on two different hardware devices:
- Vision AI Kit (QCS6490, running QIRP v1.6)
- Host PC (laptop or workstation, running Ubuntu 24.04)
Both devices run ROS2 Jazzy, and interact with each other using ROS2's DDS protocol over Ethernet.
The Host PC represents a robot, which in this case is simulated with Gazebo, but could just as well be represented with a different simulator or an actual robot.
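Since both devices must share the same ROS_DOMAIN_ID (the demo uses 0) for DDS discovery to succeed, it can help to know how the domain ID maps to network ports. The sketch below is illustrative only (it is not part of the demo code) and computes the default DDSI-RTPS discovery ports from a domain ID:

```python
# Illustrative: how ROS_DOMAIN_ID maps to DDS-RTPS UDP ports.
# Default RTPS port-mapping constants (OMG DDSI-RTPS defaults).
PB, DG, PG = 7400, 250, 2   # port base, domain gain, participant gain
D0, D1 = 0, 10              # discovery multicast / unicast offsets

def discovery_multicast_port(domain_id: int) -> int:
    """Multicast port where participants in a domain discover each other."""
    return PB + DG * domain_id + D0

def discovery_unicast_port(domain_id: int, participant_id: int) -> int:
    """Unicast discovery port for one participant in the domain."""
    return PB + DG * domain_id + D1 + PG * participant_id

if __name__ == "__main__":
    # Domain 0 (the demo's ROS_DOMAIN_ID) discovers on UDP port 7400.
    print(discovery_multicast_port(0))   # 7400
    print(discovery_unicast_port(0, 0))  # 7410
```

This is why all participants started with ROS_DOMAIN_ID=0 on the same network segment find each other automatically.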
Demo Requirements
The demo requires the following hardware components:
- Vision AI Kit 6490 (including power supply, heatsink)
- USB Webcam (Logitech HD Pro C920)
- DisplayPort monitor (or HDMI monitor with adapter)
- Host PC (laptop or workstation) running Ubuntu 24.04 (16 GB DDR recommended for Gazebo simulation)
- Ethernet router (or direct Ethernet connection)
- USB-C cable for setup & debug
The demo installation involves setup on both hardware components:
- Embedded Platform (QCS6490)
- Host PC
The next sections cover the installation steps.
1 - Qualcomm Vision AI-KIT 6490 - Flash QIRP 1.6 image
Use the following Startup Guide to program the QIRP 1.6 image to the QCS6490 Vision AI Kit:
This guide provides instructions on how to program the latest version of the QIRP 1.6 image (visionai_6490_qirp_1.6_v4.zip):
After booting the Vision AI-KIT 6490 with the QIRP 1.6 image, you can perform a sanity check with the Out of Box demo:
2 - Qualcomm Vision AI-KIT 6490 - Install the Robotics Demo (hand controller)
In order to install the robotics demo, open a command terminal and install the following additional packages:
mount -o remount,rw /usr
pip3 install flask
pip3 install edge_impulse_linux
Next, clone the "GTKGUI-6490-v2" tag of the hand_controller repository:
cd /root
git clone https://github.com/Avnet/hand_controller
cd hand_controller
git checkout GTKGUI-6490-v2
Build and install the hand_controller ROS2 package, located under the ros2_ws sub-directory:
cd ros2_ws
colcon build
source install/setup.bash
cd ..
To make the installation active on boot, add the following line to the .bashrc:
echo "source /root/hand_controller/ros2_ws/install/setup.bash" >> ~/.bashrc
3 - Qualcomm Vision AI-KIT 6490 - Launch the Robotics Demo (hand controller)
Finally, launch the hand controller with the following convenience script:
source /root/hand_controller/ros2_ws/launch_robotics_demo.sh
This script starts by configuring the environment variables to use the monitor and graphical software stack:
export QMONITOR_BACKEND_LIB_PATH=/var/QualcommProfiler/libs/backends/
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/var/QualcommProfiler/libs/
export PATH=$PATH:/data/shared/QualcommProfiler/bins
export XDG_RUNTIME_DIR=/dev/socket/weston
export WAYLAND_DISPLAY=wayland-1
Then it configures the ROS2 environment:
export HOME=/home
export ROS_DOMAIN_ID=0
source /usr/bin/ros_setup.sh && source /usr/share/qirp-setup.sh
source /root/hand_controller/ros2_ws/install/setup.bash
Finally, the ROS2 application is launched as follows:
ros2 launch hand_controller demo11_mogiros_car_part1_ei1dials.launch.py verbose:=False model:=/root/hand_controller/hands-v2-yolov5-linux-aarch64-qnn-v36.eim use_flask:=False use_imshow:=False x_t:=0.20 z_t:=0.20 | ros2 run hand_controller gtk_gui_node
You will see the following user interface appear on the monitor.
4 - Host PC - Installation
First, the recommended specifications for the Host PC hardware are the following:
- 16 GB RAM
- Ubuntu 24.04
To set up the Host PC, first ensure that it is up to date:
sudo apt update && sudo apt upgrade
sudo apt install git
Install Docker according to the instructions here:
The docker installation can be validated with the "hello-world" docker container:
$ docker run hello-world
Hello from Docker!
This message shows that your installation appears to be working correctly.
To generate this message, Docker took the following steps:
1. The Docker client contacted the Docker daemon.
2. The Docker daemon pulled the "hello-world" image from the Docker Hub.
(amd64)
3. The Docker daemon created a new container from that image which runs the
executable that produces the output you are currently reading.
4. The Docker daemon streamed that output to the Docker client, which sent it
to your terminal.
To try something more ambitious, you can run an Ubuntu container with:
$ docker run -it ubuntu bash
Share images, automate workflows, and more with a free Docker ID:
https://hub.docker.com/
For more examples and ideas, visit:
https://docs.docker.com/get-started/
With Docker successfully installed, clone the robotics docker repository:
cd $HOME
git clone https://github.com/AlbertaBeef/robotics_docker
cd robotics_docker
Create a directory for the shared drive (persistent storage), or create a symbolic link to an existing directory you want to use for this purpose:
mkdir shared
Then navigate to the launcher sub-directory, and install the launch script:
cd launcher
source ./install_launcher.sh
This will create a launch_docker.sh script in your $HOME folder, as well as create a launch icon on your desktop.
5 - Host PC - Launch Robotics Demo (robot simulation)
Finally, launch the demo by double-clicking on the desktop icon, or manually as follows:
~/launch_docker.sh
This will pull the pre-built docker container from Docker Hub, if not already present, then launch it with part 2 (robot simulation) of the robotics demo.
You will see two visualization utilities:
- rviz : visualization of the wheeled vehicle robot
- gazebo : visualization of the wheeled vehicle robot in a virtual environment
Remember, this portion of the distributed ROS2 system could be an actual robot. We are using the Gazebo simulator for convenience.
Using the alternative Flask GUI
If you DO NOT have a monitor, you can use the Flask GUI instead, as shown below:
Instead of using the provided "launch_robotics_demo.sh" script, use the following commands (or copy them into a different launch script):
export HOME=/home
export ROS_DOMAIN_ID=0
source /usr/bin/ros_setup.sh && source /usr/share/qirp-setup.sh
source /root/hand_controller/ros2_ws/install/setup.bash
ros2 launch hand_controller demo11_mogiros_car_part1_ei1dials.launch.py verbose:=False model:=/root/hand_controller/hands-v2-yolov5-linux-aarch64-qnn-v36.eim use_flask:=True use_imshow:=False x_t:=0.20 z_t:=0.20
With the Flask GUI enabled, you can connect using the embedded platform's IP address at {IP address}:5001:
If you are curious what the x_t and z_t parameters are all about, refer to the following section detailing all the parameters for the Edge Impulse controlled dials implementation.
Hand Controller - Dials
This section describes how the hand controller interprets the dials to generate twist commands to control our robot.
There are two dials which generate delta values, based on the position of the hand with respect to the dial's circle:
- delta_xy : delta value for left (xy) dial
- delta_z : delta value for right (z aperture) dial
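As an illustration of the dial concept (a hypothetical sketch, not the actual hand_controller source; the function name, circle geometry, and vertical-offset mapping are all assumptions), a dial can map the hand's position relative to its circle to a normalized delta:

```python
import math

def dial_delta(hand_xy, center_xy, radius):
    """Map a hand position to a normalized delta in [-1.0, 1.0],
    based on its vertical offset from the dial's center.
    Hypothetical sketch: the real hand_controller logic may differ."""
    dx = hand_xy[0] - center_xy[0]
    dy = hand_xy[1] - center_xy[1]
    # Ignore hands outside the dial's circle.
    if math.hypot(dx, dy) > radius:
        return 0.0
    # Scale the vertical offset by the radius and clamp.
    return max(-1.0, min(1.0, dy / radius))

# Hand at the top edge of the dial -> full deflection.
print(dial_delta((100, 40), (100, 100), 60))   # -1.0
# Hand at the dial's center -> neutral.
print(dial_delta((100, 100), (100, 100), 60))  # 0.0
```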
The hand controller can use one (1) dial or two (2) dials, as specified with the following parameter:
- dials_single:=True (use only left dial)
- dials_single:=False (use both dials) <= default
When the hand controller is configured to use a single dial, the following hand positions will generate twist commands:
When the hand controller is configured to use both dials, the following hand positions will generate twist commands:
The way in which the delta values are converted to twist commands is also configurable:
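As one plausible mapping (a sketch assuming that x_t and z_t, the launch parameters set to 0.20 above, scale the linear and angular components; the actual hand_controller conversion may differ):

```python
def deltas_to_twist(delta_xy, delta_z, x_t=0.20, z_t=0.20):
    """Convert normalized dial deltas into (linear.x, angular.z) twist values.
    Hypothetical sketch: assumes the left (xy) dial drives forward speed and
    the right (z) dial drives rotation, each scaled by x_t / z_t."""
    # Clamp deltas to [-1, 1] so the twist never exceeds the configured limits.
    delta_xy = max(-1.0, min(1.0, delta_xy))
    delta_z = max(-1.0, min(1.0, delta_z))
    return x_t * delta_xy, z_t * delta_z

# Full forward deflection on the left dial, half deflection on the right dial.
print(deltas_to_twist(1.0, -0.5))  # (0.2, -0.1)
```

In this sketch the two returned values would populate the linear.x and angular.z fields of the geometry_msgs/Twist message sent to the robot.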
I recorded myself providing a high-level overview of the demo and performing the installation steps:
Resources
- [Tria Technologies] Vision AI-KIT 6490 Product Page
- [Tria Technologies] Vision AI-KIT 6490 Startup Guide v1.4
- [Edge Impulse] Tria Vision AI-KIT 6490
- [Edge Impulse] Hands-V2 model
This demo has many components working together.
I want to acknowledge the extraordinary work of my colleagues for pulling all these pieces together:
- Lucas Keller : for the initial demo vision and prototype for the hand-controlled dials interface
- Mario Bergeron : for connecting Lucas' dials to ROS2 twist commands
- Monica Houston : for her work on gathering data for the edge impulse model, polishing the demo for public viewing, and the initial Flask-based user interface
- Maxim Saka : for the immense work involved in creating the QIRP images, and creating the GTK user interface for the final demo
- Peter Fenn : for the overall project management for the Vision AI-KIT 6490
- Brennan Dayberry and Joshua Buck : for their awesome collaboration at Edge Impulse
2025/12/08 - Initial Version (using QIRP 1.6 v3)
2025/12/09 - Update for new version of Getting Started Guide (v1.4) and QIRP 1.6 (v4)








