Have you ever wondered how your smartwatch knows when you're walking, running, cycling, or even just standing still? How does it automatically start tracking your workout or count your steps without you telling it what you're doing?
Behind the scenes, it's a blend of motion sensors and machine learning working together in real time.
In this project, we dive into exactly how that works — by building a motion detection system from scratch using the Seeed Studio XIAO nRF52840 Sense. This tiny yet powerful board comes with a built-in accelerometer and can run trained Edge AI models right on the device, without the need for any cloud processing.
By collecting real-world motion data, training a model using Edge Impulse, and deploying it back to the microcontroller, this project demonstrates how wearable devices can detect whether you're walking, running, or standing — entirely on the edge.
Whether you're building a fitness tracker, gesture controller, or just exploring embedded AI, this project brings you one step closer to understanding the intelligent tech inside today's wearables.
Project Flow
- Collect motion data using the onboard accelerometer
- Train a classification model with Edge Impulse
- Deploy the model to the XIAO board
- Perform on-device inference to detect different types of motion in real time
Create an Edge Impulse Project:
- Visit https://studio.edgeimpulse.com
- Sign up with your email and set a password.
- Once signed in, you’ll land on the Dashboard.
- Click on "Create new project".
- Name your project something like: Motion Tracker - XIAO nRF52840
- Choose the "Classification" problem type.
- Click Create Project.
Required Software:
- Arduino IDE
- Node.js (LTS version)
- Edge Impulse CLI
Install Edge Impulse CLI:
- After installing Node.js, run the following in your terminal (CMD or PowerShell):
npm install -g edge-impulse-cli
Connect the XIAO to Edge Impulse:
Option A: If using Edge Impulse Data Forwarder (via Serial)
- Connect your XIAO board to the PC via USB.
- Flash an IMU sketch that outputs accelerometer values over Serial (see the sketch after this list)
- Open a terminal and run:
edge-impulse-data-forwarder
- The CLI will detect your device and prompt you to link it to your Edge Impulse project.
- Select the right COM port and select your project from the list.
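For reference, here is a minimal forwarder sketch. It is a sketch under assumptions, not official firmware: it assumes the Seeed Arduino LSM6DS3 library that ships with Seeed's nRF52 board package, and the onboard IMU at I2C address 0x6A on the XIAO nRF52840 Sense.

// Streams accelerometer readings as CSV over Serial at 50 Hz,
// the format the Edge Impulse data forwarder expects.
#include <LSM6DS3.h>
#include <Wire.h>

LSM6DS3 myIMU(I2C_MODE, 0x6A);        // onboard IMU at I2C address 0x6A

const unsigned long INTERVAL_MS = 20; // 50 Hz sample rate

void setup() {
  Serial.begin(115200);
  while (!Serial);                    // wait for the USB serial port
  if (myIMU.begin() != 0) {           // 0 means success in this library
    Serial.println("IMU init failed");
    while (1);
  }
}

void loop() {
  static unsigned long last = 0;
  if (millis() - last >= INTERVAL_MS) {
    last += INTERVAL_MS;
    // One comma-separated sample per line; the data forwarder
    // auto-detects the sampling frequency and the number of axes
    Serial.print(myIMU.readFloatAccelX()); Serial.print(',');
    Serial.print(myIMU.readFloatAccelY()); Serial.print(',');
    Serial.println(myIMU.readFloatAccelZ());
  }
}

If the forwarder ever gets linked to the wrong project or device, run edge-impulse-data-forwarder --clean to reset its stored configuration and pair again.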
Option B: If using Edge Impulse Firmware (for fully supported devices)
- If you're using an officially supported board (like the Arduino Nano 33 BLE Sense), you can flash the official Edge Impulse firmware instead. Since the XIAO is not officially supported out of the box, Option A is preferred.
Collect Motion Data:
- Go to the "Data acquisition" tab in Edge Impulse Studio.
- Select Accelerometer as the sensor.
- Enter a label (like standing, walking, or running).
- Set the sample length to around 20000 ms (20 seconds).
- Click Start sampling and perform that motion with the device during recording.
- Repeat for all motion types.
- After data is collected, Edge Impulse will prompt you to split each 20 s sample into 2 s windows, turning one recording into around ten training samples (more if the windows overlap).
- Accept the suggestion and save the samples.
- Repeat this for all collected data.
Create the Impulse:
- Go to the Create Impulse tab.
- Add:
  - Time series data (input)
  - Spectral Analysis (feature extraction)
  - Keras Classification (learning block)
- Click Save Impulse.
- Click on "Spectral Features".
- Hit Generate Features.
- Wait for the feature explorer to load and verify that clusters appear grouped by label.
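Curious what those features actually are? The Spectral Analysis block boils each 2-second window down to a few statistics per axis, such as the RMS (overall motion intensity) plus frequency-domain features derived from an FFT. As a purely illustrative sketch (the Studio computes all of this for you), the RMS part is just:

#include <math.h>
#include <stddef.h>

// Root-mean-square of one axis over a window: a single number that
// captures how energetic the motion is (low when standing, high when running)
float window_rms(const float *samples, size_t n) {
  float sum = 0.0f;
  for (size_t i = 0; i < n; i++) {
    sum += samples[i] * samples[i];
  }
  return sqrtf(sum / n);
}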
- Go to the "NN Classifier" tab.
- Choose “Unoptimized (float32)” mode for best compatibility.
- Click Start Training.
- Wait for training to finish and review model accuracy.
- Navigate to Model testing.
- Click Classify all to evaluate performance on your test dataset.
Deploy the Model:
- Go to the Deployment tab.
- Choose Arduino Library.
- Click Build to download the .zip file.
Run Inference on the Device:
- Open Arduino IDE.
- Go to: Sketch > Include Library > Add .ZIP Library…
- Select the .zip you downloaded.
- Open the example sketch: <project-name>_inferencing.ino
- Upload to the board.
- Open Serial Monitor (Ctrl+Shift+M) at 115200 baud.
- Perform motions; the model will predict based on the trained data.
- You can also integrate an OLED display or LEDs to show real-time output; a condensed example follows.
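Here is a condensed sketch of what on-device inference plus an LED indicator can look like. It assumes the generated header is named Motion_Tracker_inferencing.h (the name follows your project name, so adjust it) and trims the error handling the official example includes:

#include <Motion_Tracker_inferencing.h>  // generated by Edge Impulse; name varies by project
#include <LSM6DS3.h>

LSM6DS3 myIMU(I2C_MODE, 0x6A);
static float buffer[EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE];

void setup() {
  Serial.begin(115200);
  pinMode(LED_BUILTIN, OUTPUT);
  myIMU.begin();
}

void loop() {
  // Fill one window (2 s of ax, ay, az) at the model's sample interval;
  // delay() is a simplification, the official example uses tighter timing
  for (size_t i = 0; i < EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE; i += 3) {
    buffer[i + 0] = myIMU.readFloatAccelX();
    buffer[i + 1] = myIMU.readFloatAccelY();
    buffer[i + 2] = myIMU.readFloatAccelZ();
    delay((uint32_t)EI_CLASSIFIER_INTERVAL_MS);
  }

  // Wrap the buffer in a signal and run the impulse
  signal_t signal;
  numpy::signal_from_buffer(buffer, EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE, &signal);
  ei_impulse_result_t result;
  if (run_classifier(&signal, &result, false) != EI_IMPULSE_OK) return;

  // Print all scores; drive the onboard LED from the "running" score
  for (size_t ix = 0; ix < EI_CLASSIFIER_LABEL_COUNT; ix++) {
    Serial.print(result.classification[ix].label);
    Serial.print(": ");
    Serial.println(result.classification[ix].value);
    if (strcmp(result.classification[ix].label, "running") == 0) {
      // Arbitrary 0.8 confidence threshold; the XIAO's onboard LED is
      // active-low, so LOW turns it on
      digitalWrite(LED_BUILTIN, result.classification[ix].value > 0.8f ? LOW : HIGH);
    }
  }
}

An OLED version would follow the same pattern, drawing result.classification[ix].label on the display instead of printing it.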
This project is just the beginning of what's possible when you combine embedded systems, motion sensors, and edge AI.
From simply detecting whether someone is walking, running, or standing, this technology can evolve into powerful real-world applications, such as:
- Fitness trackers that adapt intelligently to your movement
- Fall detection systems for elderly care and safety
- Smart helmets or cycling gear that track posture and motion
- Gesture-based control systems for touchless interfaces
- Context-aware smartphones or wearables that adapt based on your activity
- Rehabilitation monitoring tools to track patient progress after injury
- Industrial worker activity recognition for safety compliance
As hardware becomes smaller, faster, and more efficient — and with tools like Edge Impulse making AI model deployment accessible — these kinds of intelligent motion-aware systems will be found everywhere: on our wrists, in our homes, in hospitals, on factory floors, and beyond.
This project showcases how a tiny board like the XIAO nRF52840 Sense can lay the foundation for next-gen wearable intelligence. With some creativity and data, you can build anything — from a smart band to an AI-powered fitness companion.