In an earlier project I developed a self-balancing robot that could be controlled by a joystick. In this project the robot is controlled by gestures, which are recognized by a neural network based on TensorFlow Lite.
TensorFlow Lite is TensorFlow's lightweight solution for mobile and embedded devices. It lets you run machine-learned models on small devices such as an Arduino.
My sketch was implemented on an Arduino Nano 33 BLE Sense. The board has a built-in nine-axis IMU containing a 3-axis gyroscope, a 3-axis accelerometer, and a 3-axis magnetometer.
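The sensors can be read with the Arduino_LSM9DS1 library that comes with the board package. A minimal sketch of my own (not the project code) that initializes the IMU and prints the raw readings:

```cpp
#include <Arduino_LSM9DS1.h>  // driver for the on-board LSM9DS1 9-axis IMU

void setup() {
  Serial.begin(9600);
  while (!Serial);

  if (!IMU.begin()) {
    Serial.println("Failed to initialize IMU!");
    while (1);
  }
  // accelerometer and gyroscope both sample at 119 Hz on this board
  Serial.print("Accelerometer sample rate: ");
  Serial.println(IMU.accelerationSampleRate());
  Serial.print("Gyroscope sample rate: ");
  Serial.println(IMU.gyroscopeSampleRate());
}

void loop() {
  float aX, aY, aZ, gX, gY, gZ;
  if (IMU.accelerationAvailable() && IMU.gyroscopeAvailable()) {
    IMU.readAcceleration(aX, aY, aZ);  // units: g
    IMU.readGyroscope(gX, gY, gZ);     // units: degrees per second
    Serial.print(aX); Serial.print('\t');
    Serial.print(aY); Serial.print('\t');
    Serial.print(aZ); Serial.print('\t');
    Serial.print(gX); Serial.print('\t');
    Serial.print(gY); Serial.print('\t');
    Serial.println(gZ);
  }
}
```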
For training with TensorFlow, the desired gestures are first recorded with the gyroscope and the accelerometer (sketch IMU_Capture.ino). The training then takes place with this data on the PC, in Python, using a Jupyter notebook or Google Colab. Google Colab provides a hosted Jupyter notebook that lets us train the machine learning model in a web browser. After training, the network is converted to TensorFlow Lite; to use a model with TensorFlow Lite, you must convert the full TensorFlow model into the TensorFlow Lite format.
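The recording step mentioned above is handled by IMU_Capture.ino. Condensed, its logic follows a simple pattern: wait for a significant acceleration, then stream a fixed window of samples as CSV over the serial port. A sketch of that pattern, with illustrative threshold and sample-count values:

```cpp
#include <Arduino_LSM9DS1.h>

const float accelerationThreshold = 2.5;  // trigger threshold in g (illustrative)
const int numSamples = 119;               // ~1 s at the IMU's 119 Hz sample rate

void setup() {
  Serial.begin(9600);
  while (!Serial);
  if (!IMU.begin()) {
    Serial.println("Failed to initialize IMU!");
    while (1);
  }
  Serial.println("aX,aY,aZ,gX,gY,gZ");  // CSV header for the training data
}

void loop() {
  float aX, aY, aZ, gX, gY, gZ;

  // wait until the summed acceleration magnitude indicates significant motion
  while (true) {
    if (IMU.accelerationAvailable()) {
      IMU.readAcceleration(aX, aY, aZ);
      if (fabs(aX) + fabs(aY) + fabs(aZ) >= accelerationThreshold) break;
    }
  }

  // record a fixed window of samples; one CSV block per gesture
  for (int i = 0; i < numSamples; ) {
    if (IMU.accelerationAvailable() && IMU.gyroscopeAvailable()) {
      IMU.readAcceleration(aX, aY, aZ);
      IMU.readGyroscope(gX, gY, gZ);
      Serial.print(aX); Serial.print(',');
      Serial.print(aY); Serial.print(',');
      Serial.print(aZ); Serial.print(',');
      Serial.print(gX); Serial.print(',');
      Serial.print(gY); Serial.print(',');
      Serial.println(gZ);
      i++;
    }
  }
  Serial.println();  // blank line separates consecutive gesture recordings
}
```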
Finally, the converted model is copied into a header file of the Arduino sketch IMU_Classifier.ino.
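On the PC side, the .tflite file is typically turned into a C array, for example with `xxd -i model.tflite > model.h` on Linux. The resulting header (array name as in the TinyML tutorial; byte values abbreviated here) looks roughly like this:

```cpp
// model.h - the converted TensorFlow Lite model as a C byte array
// (generated e.g. with: xxd -i model.tflite)
alignas(8) const unsigned char model[] = {
  0x1c, 0x00, 0x00, 0x00, 0x54, 0x46, 0x4c, 0x33,  // FlatBuffer header, "TFL3"
  // ... remaining bytes of the converted model ...
};
const unsigned int model_len = sizeof(model);
```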
The necessary steps are explained very well in the TinyML articles by Sandeep Mistry and Don Coleman.
Templates for the sketches are available on GitHub:
- ArduinoSketches/IMU_Capture/IMU_Capture.ino
- ArduinoSketches/IMU_Classifier/IMU_Classifier.ino
I've modified these sketches for my purposes.
The sketch IMU_Classifier.ino uses the on-board IMU to read acceleration and gyroscope data; once enough samples have been read, it uses a TensorFlow Lite (Micro) model to classify the movement as one of the known gestures. After a gesture is recognized, the corresponding X and Y coordinates are sent to the self-balancing robot. Unfortunately I could not use the Bluetooth LE radio of the Arduino Nano, because BLE is not compatible with the classic-Bluetooth HC-05 module on the DUE board of the self-balancing robot. That's why I attached an HC-05 Bluetooth module to the Nano as well.
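To illustrate the modification, here is a hypothetical sketch of how the recognized gesture could be mapped to joystick-style coordinates and written to the HC-05 on the Nano's hardware UART; the gesture names, coordinate values, and the "x,y" line format are my placeholders, not the actual protocol of the robot:

```cpp
// Hypothetical mapping from the classified gesture to joystick-style
// coordinates. Serial1 (the TX/RX pins of the Nano, opened with
// Serial1.begin(9600) in setup()) is wired to the HC-05 module.
struct GestureCommand { const char* name; int x; int y; };

const GestureCommand commands[] = {
  { "forward",    0,  100 },
  { "backward",   0, -100 },
  { "left",    -100,    0 },
  { "right",    100,    0 },
};

// called after inference; 'scores' points at the model's output probabilities
void sendGestureCommand(const float* scores, int numGestures) {
  int best = 0;                      // index of the most probable gesture
  for (int i = 1; i < numGestures; i++) {
    if (scores[i] > scores[best]) best = i;
  }
  Serial1.print(commands[best].x);   // transmit "x,y" to the robot
  Serial1.print(',');
  Serial1.println(commands[best].y);
}
```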
The following plots show the capture of a gesture.
[Plots: acceleration and gyroscope data recorded for the gesture 'forward']
Link: My Jupyter notebook at Google Colab, which you can run in Google Colaboratory.