Movies like Iron Man have made us dream about technologies such as smart assistants and controlling devices with just a wave of your hand. Back then, it all sounded like sci-fi, but we are now getting closer to achieving it. Sensors are smarter, processors fit into almost anything, and machine learning models keep improving. All of this inspired me to build a 'magic wand' wearable that lets us control devices with hand gestures.
Wearable devices are becoming popular as sensors get smaller, cheaper, and more energy efficient, while processors become powerful enough to run machine learning directly on-device. This advancement is empowering tech companies and developers to build systems that can understand human actions and respond instantly. This project showcases how such technologies can move us beyond pressing buttons and flipping switches, opening up opportunities for hands-free, portable, and context-aware solutions such as gesture control of home appliances, assistive aids for people with limited mobility, and even simple industrial smart-glove controls. While gesture detection has been well explored, this project focuses on applying AI and IoT together to build intelligent solutions that work in real-world scenarios.
With advancements in the hardware and software ecosystem, running on-device AI models on constrained devices such as microcontrollers is no longer the hard part; the challenge at hand is to develop innovative solutions that put this technology to use. Microcontrollers have become more advanced, with dual processors and megabytes of flash and RAM, while still requiring very little power to run. Software libraries and frameworks allow us to optimize Machine Learning models so that they fit in kilobytes of flash, enabling us to add intelligent operations to even the most basic microcontrollers.
In a Machine Learning pipeline we need data to train a model. In this project the data is hand motion, recorded by an IMU sensor. The involved tasks are reading data from the IMU, turning it into a dataset, training a hand gesture classification model, and deploying that model to the microcontroller. With the Edge Impulse platform, these tasks have been simplified, enabling us to rapidly create efficient AI solutions. I would also like to express my sincere gratitude to the company for supporting me in carrying out this research project.
The project demonstrates how we can leverage motion sensing and Edge AI to control devices using hand gestures. A wearable device is developed to run an on-device machine learning model that detects and classifies hand gestures in real time. Once a gesture is recognized, a command is sent via BLE to an actuating device such as the Arduino Nano 33 BLE, which in turn controls relays and actuators to switch appliances, such as lights, ON or OFF. All inference happens locally on the wearable, allowing low latency, privacy, and energy efficiency. The AI model running on the wearable has been trained to detect circular and left-right motions of the hand, as well as to distinguish these gestures from random motion and no motion.
I utilized the low-cost XIAO nRF52840 Sense given that it has an onboard 6 DOF IMU (LSM6DS3TR-C), Bluetooth LE 5.2, ultra-low power consumption, and a small form factor that makes it a great fit for a wearable. The onboard IMU is used to detect hand motion and an AI model running on the MCU classifies the motion. The BLE connectivity is used to send commands to an actuating device, the Arduino Nano 33 BLE, which has a relay connected to a GPIO and is used to power electrical appliances such as lights. These specifications make the XIAO nRF52840 Sense a great choice, meeting the project's requirements in a compact, low-cost device.
The XIAO nRF52840 Sense sends 'commands' as characters via BLE and the Arduino Nano 33 BLE uses the received data to turn lights on and off, for demonstration purposes.
The next part of this documentation describes how to develop a similar wearable and the various configurations that I made. Want to test the wearable right away? You can jump to section '03. Deploying the model' to start building your wearable.
You can access the public Edge Impulse project at this link: Gesture detection. All the source code for the project is available in this GitHub repository: gesture-control-AI-wearable.
Components and hardware configuration
Software components:
- Arduino IDE
- Edge Impulse Studio account
Hardware components:
- XIAO nRF52840 Sense
- Arduino Nano 33 BLE
- USB-C and micro USB cables
- Personal computer (PC)
To ensure realistic training data, the XIAO nRF52840 Sense is first worn on the wrist during data collection, matching the intended wearable placement and gesture detection conditions. I designed a simple enclosure that houses the board and a LiPo battery. The case has a USB-C slot that allows us to connect a PC to the XIAO nRF52840 Sense and collect IMU data during the data acquisition step of the project. I also attached wrist straps that I designed and had printed in TPU material. After 3D printing the case, I soldered wires to the battery pads on the XIAO board, allowing the board to be powered by the battery.
With the XIAO nRF52840 Sense now secured to the wrist, we can collect data and train a hand gesture classification model. The motion classes used for the project are:
- circle
- left-right
- random-motion
- idle
To collect IMU data from the XIAO nRF52840 Sense and create a dataset, we will utilize the Arduino sketch XIAO_nRF52840_Sense_print_IMU_data.ino to read raw data from the board. The sketch continuously sends 3 axes of sensor data via Serial, and our computer forwards the data to an Edge Impulse project.
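For reference, here is a minimal sketch of what that data-collection code looks like, assuming the Seeed LSM6DS3 library and a ~50 Hz output rate; refer to XIAO_nRF52840_Sense_print_IMU_data.ino in the GitHub repository for the exact code used in the project.

```cpp
// Minimal data-collection sketch (a sketch of the idea, not the repository code):
// streams 3 accelerometer axes over Serial so the edge-impulse-data-forwarder
// can pick them up.
#include "LSM6DS3.h"
#include "Wire.h"

LSM6DS3 myIMU(I2C_MODE, 0x6A);         // onboard IMU of the XIAO nRF52840 Sense
const unsigned long INTERVAL_MS = 20;  // 20 ms per sample -> ~50 Hz (assumed rate)

void setup() {
  Serial.begin(115200);
  while (!Serial);
  if (myIMU.begin() != 0) {            // 0 means the IMU initialized successfully
    Serial.println("IMU initialization failed!");
    while (1);
  }
}

void loop() {
  unsigned long start = millis();
  // Tab-separated values: the 3 axes we later name Ax, Ay, Az in the data forwarder
  Serial.print(myIMU.readFloatAccelX()); Serial.print('\t');
  Serial.print(myIMU.readFloatAccelY()); Serial.print('\t');
  Serial.println(myIMU.readFloatAccelZ());
  // Keep the output rate steady so the forwarder detects a stable frequency
  while (millis() - start < INTERVAL_MS);
}
```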
On your personal computer (preferably a laptop), ensure that you have installed the Arduino IDE and the Seeed nRF52 mbed-enabled Boards package by Seeed Studio. Next, install the LSM6DS3 library, which is used to obtain data from the IMU. While uploading the code to print raw IMU data, I encountered compatibility issues that resulted in errors such as "class arduino::MbedSPI has no member named 'setBitOrder'"; even when the code uploaded successfully, the IMU readings were all 0. After debugging, I got things working by using version 2.7.2 of the Seeed board package and version 2.7.2 of the LSM6DS3 library.
Connect your XIAO board to your PC, select the 'Seeed XIAO BLE Sense - nRF52840' board followed by the appropriate port, and upload the sketch. On a Serial terminal such as the Serial Monitor, you should see raw IMU data from the board that changes when the board is moved. We are now ready to connect our XIAO board to Edge Impulse Studio.
The first step to connect the XIAO board and retrieve data is to install the Edge Impulse CLI on your PC. This is a suite of tools used to control local devices, act as a proxy that synchronizes data for devices that don't have an internet connection, and upload and convert local files. In this project, we will utilize the edge-impulse-data-forwarder tool of the CLI to capture Serial data from the XIAO board and automatically upload it to an Edge Impulse project. You can follow the installation steps according to your computer's Operating System. Open Edge Impulse Studio and create a new project with your preferred name.
Connect your XIAO nRF52840 Sense to your PC and make sure no Serial terminal application is accessing the device's COM port. Once the project has been created, ensure your PC has internet access and run the command below on a CLI to start the data forwarder:
edge-impulse-data-forwarder
You will be requested to enter your Edge Impulse account credentials and select the new project. If all is good, the data forwarder will automatically detect 3 sensor axes and we can name them Ax, Ay, Az.
Now, when we go to 'Data acquisition' in our Edge Impulse project, we will see that our XIAO board has been connected and the platform has detected the 3 sensor axes.
We can now start sampling hand gesture data. Go to 'Data acquisition' and, in the 'Collect data' section, first set the label/class for the data being collected. We can also configure the duration of sampling using the 'Sample length (ms.)' setting; I set the sample length to 120 seconds. Click 'Start sampling' and Edge Impulse Studio will automatically instruct the data forwarder running on your PC to start sampling IMU data via Serial and upload it to the project. Once the set duration of sampling has been completed, the sampled data will be visible under 'Dataset'.
In my case, I collected data while moving my hand in circular, left-right, and random motions, allowing the resulting model to be more effective in recognizing these motions. I also collected data while the wearable was still, enabling the model to recognize when there is no movement. In total, I created a dataset of about 6 minutes, with equal durations for the circular, left-right, random-motion, and idle states.
Looking at the accelerometer data from the IMU, we can observe a distinct "trend" for each gesture in the time-series representation of the motion data. This tells us that an AI model should be capable of analyzing these trends and learning the gesture movements.
Before proceeding to process the IMU data, we can split the collected data into 1-second samples by clicking the '3-dot' vertical menu on each sample and then choosing 'Split Sample'. Click '+Add Segment' and then click the graph to create a segment. Repeat this until all the motion data on the graph has been segmented. Finally, click 'Split' and you will see the samples split into 1-second segments (or whatever duration is configured in 'Set segment length (ms.)').
Finally, the last step in creating our dataset is to split it into training and testing sets. A popular rule of thumb is the 80/20 split: 80% of the dataset is used for model training while 20% is used for model testing. In the Edge Impulse Studio project, we can click the red triangle with an exclamation mark (as shown in the image below) to open an interface that suggests splitting our dataset. We then click the 'Perform train / test split' button, confirm that we want to rebalance the dataset by clicking 'Yes, perform train / test split', and finally enter 'perform split' in the next window as prompted, followed by clicking 'Perform train / test split'.
After collecting data for our project, we can now train a Machine Learning model for the gesture classification task. To do this, in the Edge Impulse project, we need to create an Impulse, which is a configuration that defines the input data type, the data pre-processing algorithm, and the Machine Learning model to be trained.
The first step is to create an Impulse by clicking the 'Create impulse' button. In my case, I set the input data to a window size and window increase of 1000 ms (1 second). For processing the raw data and extracting meaningful features, I utilized Spectral Analysis. This algorithm is great for analyzing repetitive motion, such as data from accelerometers, and it extracts the frequency and power characteristics of a signal over time. Finally, the processed data in each window is fed to a Classification learning block, which is designed to learn patterns from data and apply them to new data; it is great for categorizing movement or recognizing audio. After setting the configuration for the Impulse, we click the 'Save Impulse' button.
The next step is to configure the processing block and generate features from the training set. Under Impulse design, click 'Spectral features'. We can use the default DSP parameters or leverage the 'Autotune parameters' feature to automatically optimize them for better performance. In my case, I proceeded with the default parameters.
On the page, click 'Save parameters' followed by the 'Generate features' button. The features generation process will take some time depending on the size of the data. When this process is finished, the Feature explorer will plot the features. Note that the features are the output of the processing block, and not the raw data itself.
Looking at the features, we can observe that when utilizing Spectral Analysis the clusters for the 4 classes are well separated (meaningful), illustrating that a classifier can learn the differences.
The last step is to train the model. We click 'Classifier' and set the training parameters. I used 100 training cycles (epochs) and a learning rate of 0.001. For a relatively small project such as this one, we can use a CPU as the training processor, and in this case the training process takes less than 10 minutes. Click 'Save & train' to start the training process.
The classifier is a simple fully connected feedforward neural network with an input layer, two dense layers and an output layer. In the input layer, each sample is represented as a vector of X numerical values (features) that come from the Spectral Analysis digital signal processing block. Next, the dense layers transform the features and learn patterns. Finally, the output layer produces a probability distribution of the given input over the 4 motion classes (prediction).
After the training process is complete, the resulting model in my case had an accuracy of 96% with a loss of 0.08. I chose this as an acceptable experimental performance and proceeded to test the model on unseen data.
When training our model, we used 80% of the data in our dataset. The remaining 20% is used to test the accuracy of the model in classifying unseen data. Before deploying the model, we need to verify that it has not overfit (memorized the training data) by testing it on new/unseen data. To test our model, we first click 'Model testing' then 'Classify all'. The model gave an impressive test performance of 96%, which I considered acceptable, so I proceeded to deploy the model to the XIAO nRF52840 Sense.
We will deploy the Impulse as an Arduino library: a single package containing all the processes for sampling IMU data for 1 second, processing the data, and feeding the features to the classifier. We can include this package (Arduino library) in our own sketches to run the Impulse locally on microcontrollers such as the XIAO nRF52840 Sense. Technically, no programming is needed to deploy the Impulse, as the Edge Impulse platform generates all the required source code as well as example sketches, which is awesome!
Ensure the Impulse is the active one and then click 'Deployment'. In the field 'Search deployment options' select Arduino library.
Since memory and CPU clock rate are limited for our deployment, we can optimize the model so that it can utilize the available resources on the nRF52840 (or simply, so that it can fit and manage to run on the SoC). Model optimization often has a trade-off whereby we decide whether to trade model accuracy for improved performance or reduced memory (RAM) usage. Edge Impulse has made model optimization very easy with just a click. Currently we can apply two optimizations: the EON Compiler (which gives the same accuracy but uses 54% less RAM and 61% less ROM) and TensorFlow Lite. To enable model optimizations, I selected the EON Compiler and Quantized (int8).
Click 'Build' and this will start a task that packages your Impulse accordingly; finally, a zip file will be downloaded to your computer. Add the library to your Arduino IDE and open the customized inference sketch (available in the GitHub repository). The inference sketch continuously samples 1 second of IMU data and classifies the motion. When circular and left-right motions are detected, the XIAO board sends a character ('1' and '0' respectively) via BLE.
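To outline how that inference sketch works, below is a simplified version of its main loop. It assumes the Arduino library exported from the Edge Impulse project (the header name depends on your project), the Seeed LSM6DS3 library, and a hypothetical sendGestureCommand() helper that stands in for the BLE transmission handled in the repository sketch.

```cpp
// Simplified inference loop (a sketch, not the exact repository code)
#include <Gesture_detection_inferencing.h>  // header name depends on your Edge Impulse project
#include "LSM6DS3.h"
#include "Wire.h"

LSM6DS3 myIMU(I2C_MODE, 0x6A);
static float features[EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE];  // 1-second window, 3 axes

// Hands the buffered IMU samples to the classifier
static int get_feature_data(size_t offset, size_t length, float *out_ptr) {
  memcpy(out_ptr, features + offset, length * sizeof(float));
  return 0;
}

// Hypothetical helper: in the repository sketch the character is sent over BLE
// to the Arduino Nano 33 BLE instead of being printed.
void sendGestureCommand(char c) {
  Serial.print("Gesture command: ");
  Serial.println(c);
}

void setup() {
  Serial.begin(115200);
  myIMU.begin();
}

void loop() {
  // 1. Sample 1 second of accelerometer data at the Impulse's sampling frequency
  for (size_t i = 0; i < EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE; i += 3) {
    features[i]     = myIMU.readFloatAccelX();
    features[i + 1] = myIMU.readFloatAccelY();
    features[i + 2] = myIMU.readFloatAccelZ();
    delayMicroseconds((uint32_t)(1000000.0f / EI_CLASSIFIER_FREQUENCY));
  }

  // 2. Run the Impulse (DSP + classifier) on the buffered window
  signal_t signal;
  signal.total_length = EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE;
  signal.get_data = &get_feature_data;
  ei_impulse_result_t result;
  if (run_classifier(&signal, &result, false) != EI_IMPULSE_OK) return;

  // 3. Act on confident predictions: '1' for circle, '0' for left-right
  for (size_t i = 0; i < EI_CLASSIFIER_LABEL_COUNT; i++) {
    if (result.classification[i].value > 0.8f) {
      if (strcmp(result.classification[i].label, "circle") == 0)     sendGestureCommand('1');
      if (strcmp(result.classification[i].label, "left-right") == 0) sendGestureCommand('0');
    }
  }
}
```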
04. Programming IoT control board
After training the model and deploying it to the XIAO board, the remaining task is to program a BLE device such as the Arduino Nano 33 BLE to receive data from the XIAO board and in turn control the logic levels of a GPIO pin. We can use this logic to control devices such as relays, which are in turn connected to appliances such as lights, fans, motors, etc. In my case, I connected a single-channel relay to a GPIO of the Arduino Nano and the relay was wired to switch lights (Christmas lights of this season).
This step is fairly simple. We need to connect a relay to a GPIO of the Arduino Nano and upload the BLE peripheral code to the board. In the code, the variable RELAY_PIN selects the GPIO whose logic level controls the relay. After uploading the code to the Arduino Nano 33 BLE, we can check that everything is okay by making hand gestures with the XIAO board and watching the onboard LED on the Arduino Nano 33 BLE toggle on and off.
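To illustrate the structure of that BLE peripheral code, here is a minimal sketch using the ArduinoBLE library. The service and characteristic UUIDs and the RELAY_PIN number are placeholder assumptions; they must match the UUIDs used by the wearable's sketch and your actual wiring (see the repository for the code used in the project).

```cpp
// Minimal BLE peripheral sketch for the Arduino Nano 33 BLE (illustrative only)
#include <ArduinoBLE.h>

const int RELAY_PIN = 2;  // GPIO driving the relay module (placeholder pin)

BLEService gestureService("19B10000-E8F2-537E-4F6C-D104768A1214");         // placeholder UUID
BLECharCharacteristic commandChar("19B10001-E8F2-537E-4F6C-D104768A1214",  // placeholder UUID
                                  BLERead | BLEWrite);

void setup() {
  pinMode(RELAY_PIN, OUTPUT);
  pinMode(LED_BUILTIN, OUTPUT);

  if (!BLE.begin()) {
    while (1);  // BLE failed to start
  }
  BLE.setLocalName("GestureReceiver");
  BLE.setAdvertisedService(gestureService);
  gestureService.addCharacteristic(commandChar);
  BLE.addService(gestureService);
  BLE.advertise();
}

void loop() {
  BLEDevice central = BLE.central();   // wait for the wearable to connect
  if (!central) return;

  while (central.connected()) {
    if (commandChar.written()) {
      char command = commandChar.value();
      // Assumed mapping: '1' (circle) switches the relay and LED on, '0' (left-right) off
      digitalWrite(RELAY_PIN, command == '1' ? HIGH : LOW);
      digitalWrite(LED_BUILTIN, command == '1' ? HIGH : LOW);
    }
  }
}
```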
Being in the Festive Season mood, I wired the relay to switch power to Christmas lights, showcasing how we can in turn use the logic to also control other devices.
Results
From sci-fi to reality, we can now control devices and appliances with hand gestures. The wearable is fairly low-cost and it is impressive to see how we can easily develop such an application using the emerging AI and IoT technologies. Remote actuation infrastructures have been widely deployed, but integrating AI as the control logic is opening up new possibilities for more intelligent systems.
Below is a demonstration of the wearable controlling Christmas lights using circular and left-right hand gestures and BLE actuation.
This project shows how you can leverage Edge AI and low-power IoT communication to build a simple gesture control system that actually works in the real world. The wearable handles gesture recognition on-device, and uses BLE to control other nearby devices. This keeps everything fast and cost effective.
Currently, the wearable only recognizes a handful of gestures, but we can expand the set of recognized gestures by collecting more data, retraining the model, and connecting more devices. The results showcase the potential of Edge-AI-enabled wearables as human–machine interfaces. All the source code developed for this project and the wearable CAD designs are available in the GitHub repository.