This project presents the development of an end-to-end smart wearable system designed to recognize and classify human physical activities in real time. The solution combines embedded systems, cloud-based infrastructure, machine learning, and a web-based application to form a complete IoT pipeline with low latency and dependable data flow.
System Overview

At the device layer, the system is built around the RT-Thread RT-Spark development board, powered by an STM32 microcontroller. Firmware was developed in RT-Thread Studio to configure and manage the onboard inertial sensors, the accelerometer and gyroscope, along with the Wi-Fi communication module. Motion data is sampled at a rate chosen to preserve classification accuracy while keeping transmission overhead low. The collected multi-axis sensor readings are structured into packets and transmitted wirelessly to the cloud at regular intervals.
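To make the packet structure concrete, here is a minimal Python sketch of how readings might be batched and serialized. The real firmware is C on RT-Thread; the field names, the 50 Hz rate, and the batch size here are illustrative assumptions, not values taken from the project.

```python
import json
import time

SAMPLE_RATE_HZ = 50          # hypothetical sampling frequency
SAMPLES_PER_PACKET = 25      # batch samples to reduce transmission overhead

def read_imu() -> dict:
    """Placeholder for the accelerometer/gyroscope driver read."""
    return {"ax": 0.0, "ay": 0.0, "az": 0.0, "gx": 0.0, "gy": 0.0, "gz": 0.0}

def build_packet(device_id: str, samples: list) -> str:
    """Structure multi-axis IMU readings into one JSON packet."""
    return json.dumps({
        "device_id": device_id,
        "timestamp_ms": int(time.time() * 1000),
        "samples": samples,   # each entry holds ax, ay, az, gx, gy, gz
    })

buffer = []
for _ in range(SAMPLES_PER_PACKET):
    buffer.append(read_imu())
    time.sleep(1 / SAMPLE_RATE_HZ)

packet = build_packet("rt-spark-01", buffer)
# On the real device, this packet would go out over Wi-Fi to the cloud.
```

Batching several samples per packet is what keeps the transmission overhead low relative to per-sample sends, at the cost of a small, bounded delivery delay.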
Cloud Infrastructure and Data Handling

Mango Cloud serves as the central cloud platform for device connectivity, data ingestion, and system coordination. Incoming sensor streams are authenticated, processed, and forwarded to the backend services. The server-side architecture was designed to handle continuous data inflow while simultaneously managing API requests, ensuring stable operation without data collisions or processing delays.
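One common way to keep ingestion from blocking concurrent API traffic is to authenticate the packet, enqueue it, and return immediately. The sketch below assumes a Flask endpoint, a token header, and a queue handoff; these are illustrative choices, not details of the Mango Cloud integration itself.

```python
from queue import Queue
from flask import Flask, request, jsonify

app = Flask(__name__)
VALID_TOKENS = {"rt-spark-01": "secret-token"}   # placeholder credentials
inference_queue = Queue()  # decouples ingestion from downstream processing

@app.post("/ingest")
def ingest():
    packet = request.get_json(force=True)
    token = request.headers.get("X-Device-Token", "")
    # Authenticate the device before accepting its stream.
    if VALID_TOKENS.get(packet.get("device_id")) != token:
        return jsonify({"error": "unauthorized"}), 401
    # Hand the samples to a worker thread so this handler returns
    # quickly and concurrent API requests are never blocked.
    inference_queue.put(packet["samples"])
    return jsonify({"queued": len(packet["samples"])}), 202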
Activity Recognition and Machine Learning

The activity detection engine is driven by a lightweight machine learning model trained on motion sensor datasets. The model focuses on capturing short-term temporal features from accelerometer and gyroscope signals to accurately distinguish between different activity states. Once trained, the model was integrated into the backend to perform live inference on streaming sensor data. All predictions are logged with timestamps, enabling both real-time monitoring and long-term activity analysis.
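The live-inference loop can be pictured as a sliding window that accumulates frames until a full sequence is available, then classifies it and logs the result with a timestamp. In this sketch the window size, the label set, and the variance-threshold classifier are stand-ins, since the report does not name the exact model or sequence length.

```python
from collections import deque
from datetime import datetime, timezone

import numpy as np

WINDOW_SIZE = 50                      # frames per inference window (assumed)
window = deque(maxlen=WINDOW_SIZE)

def classify(frames: np.ndarray) -> str:
    """Stand-in for the trained model: threshold on short-term variance."""
    intensity = float(frames.var())
    if intensity < 0.05:
        return "resting"
    return "walking" if intensity < 1.0 else "running"

def on_frame(frame: list, log: list) -> None:
    """Accumulate frames; run inference once a full window is available."""
    window.append(frame)
    if len(window) == WINDOW_SIZE:
        label = classify(np.asarray(window))
        log.append({
            "activity": label,
            "timestamp": datetime.now(timezone.utc).isoformat(),
        })
```

Logging every prediction with its timestamp is what makes both the real-time view and the long-term history possible from the same record stream.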
Progressive Web Application (PWA)

A responsive Progressive Web App was developed to visualize system outputs and interact with the cloud backend. The frontend is built using HTML and CSS for layout and styling, while JavaScript manages data retrieval, real-time updates, and visualization logic. The application communicates with Mango Cloud via RESTful APIs, allowing users to view current activity states as well as historical records. Asynchronous data handling ensures smooth user interaction during continuous data updates.
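The two views the PWA needs, current state and history, map naturally onto two read-only REST endpoints. The sketch below shows one plausible backend shape in Python; the paths, query parameter, and response fields are assumptions rather than the documented Mango Cloud API, and the actual frontend consumes them from JavaScript.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)
activity_log = []   # filled by the inference worker with timestamped labels

@app.get("/api/activity/current")
def current_activity():
    # Latest prediction, or a neutral placeholder before the first one.
    latest = activity_log[-1] if activity_log else {"activity": "unknown"}
    return jsonify(latest)

@app.get("/api/activity/history")
def activity_history():
    # The PWA can page through past records for long-term analysis.
    limit = int(request.args.get("limit", 100))
    return jsonify(activity_log[-limit:])
```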
User Interface Design and System Transparency

The mobile interface was intentionally designed to provide both clear activity feedback and insight into the system’s internal processing. Instead of presenting only a final classification, the UI exposes intermediate metrics that explain how sensor data is analyzed and interpreted.
The current detected activity—such as walking, running, or resting—is displayed prominently at the top of the screen, accompanied by an animated icon representing the motion state. A timestamp confirms when the classification was generated, highlighting the system’s real-time capabilities.
Each activity mode includes a circular visual indicator with distinct colors and animations to reinforce the detected state. Dynamic animations are used for active movements, while a static symbol represents inactivity, allowing users to quickly recognize transitions.
Below the activity label, two diagnostic values are shown: motion intensity and window accumulation status. The motion intensity metric reflects the calculated variance from recent sensor readings, where lower values indicate minimal movement and higher values correspond to more vigorous activity. Displaying this metric helps users understand the reasoning behind each classification.
The window accumulation indicator shows how many sensor frames have been collected relative to the required sequence length for inference. This illustrates the temporal buffering approach used by the model to produce stable and reliable predictions.
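The two diagnostics follow directly from the definitions above: motion intensity as the variance of recent multi-axis readings, and window accumulation as frames collected versus the sequence length the model requires. This sketch derives both; the frame count and exact variance formula are assumptions, since the project does not publish them.

```python
from collections import deque

import numpy as np

REQUIRED_FRAMES = 50          # sequence length the model needs (assumed)
frames = deque(maxlen=REQUIRED_FRAMES)

def diagnostics() -> dict:
    """Compute the two values surfaced in the UI."""
    data = np.asarray(frames) if frames else np.zeros((1, 6))
    return {
        # Motion intensity: variance of recent readings; low values mean
        # minimal movement, high values mean more vigorous activity.
        "motion_intensity": float(data.var()),
        # Window accumulation: frames collected vs. the required length,
        # i.e. how close the buffer is to triggering an inference.
        "window_fill": f"{len(frames)}/{REQUIRED_FRAMES}",
    }
```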
At the bottom of the interface, a live signal chart visualizes recent motion data. Peaks and fluctuations in the graph directly correspond to changes in physical movement, helping users intuitively connect raw sensor signals with the resulting activity predictions.