In the rapidly evolving landscape of the Internet of Things (IoT), real-time activity recognition has become a cornerstone of personalized health and fitness tracking. This project involves the design and implementation of a comprehensive, full-stack IoT system tailored for the real-time classification of user activities, such as walking or running. By integrating an RT-Thread RT-Spark development board equipped with an onboard accelerometer and gyroscope, the system captures precise motion data at the edge. This data is then transmitted via Wi-Fi to a cloud-based back-end where a machine learning model—trained to recognize patterns in time-series data—infers the user's current movement. Finally, the results are delivered to a custom-built Android mobile application, providing users with immediate visualizations and historical records of their physical activity.
Key Project Objectives
- Embedded Firmware Development: Programming the RT-Spark board to sample sensor data and manage network communications.
- Intelligent Classification: Selecting and training a lightweight machine learning model capable of extracting temporal correlations from motion sensors.
- Cloud & API Integration: Building a robust back-end to handle real-time predictions and maintain a database of user history.
- Mobile Visualization: Creating an intuitive Android interface using libraries like Volley for seamless interaction with cloud APIs.
To kickstart the development of your wearable activity tracker, the first essential step is gathering and configuring the specific hardware and software components required for a full-stack IoT implementation. This project leverages the RT-Spark development board as the core wearable device, utilizing its integrated ICM20608 sensor to capture motion data and the rw007 module for seamless Wi-Fi connectivity. On the software side, a robust environment comprising RT-Thread Studio for firmware development, Google Firebase for cloud-based data management, and Android Studio for mobile visualization ensures that every layer of the system—from edge sensing to real-time analytics—is properly supported.
Materials List:
Hardware Components
- RT-Spark (STM32F407ZGT6): The main microcontroller unit used for embedded system development.
- RW007 Wi-Fi Module (Embedded): Enables network communication between the wearable and the cloud.
- ICM20608 (Embedded): Provides the 6-axis gyroscope and accelerometer readings necessary for motion classification.
Software & Platforms
- RT-Thread Studio: The IDE for developing the device firmware and managing RT-OS components.
- Google Firebase: Acts as the device management platform and database for storing real-time and historical predictions.
- Android Studio: Used to develop the mobile application for data visualization.
- Visual Studio Code: A versatile editor for back-end scripting, API development, or machine learning model training.
To establish a functional communication link between the hardware sensors and the cloud, you must precisely configure the RT-Thread real-time operating system environment. This step ensures that the ICM20608 sensor and RW007 Wi-Fi module are correctly initialized and mapped to the microcontroller pins.
1. Initializing the Project Environment
The first step is to create a firmware project that leverages the pre-defined hardware abstraction layer for your specific development board.
- Launch RT-Thread Studio and navigate to File > New > RT-Thread Project.
- Critical Selection: In the project wizard, select "Base on Board". This ensures the IDE applies the automated pin configurations specific to the RT-Spark's internal wiring.
Configure the project settings with the following parameters:
- Board: STM32F407-RT-SPARK
- RT-Thread Version: 4.1.1
- Adapter: ST-LINK
2. Software Package Integration
Once the project structure is generated, you must import the specific drivers (Software Packages) required to interface with the embedded sensors and the network transceiver.
- Open the RT-Thread Settings file located in the Project Explorer (left panel).
- Navigate to the SoftPackage tab and click the "Add" button.
- Search for and add the following two packages:
- icm20608: The driver for the 6-axis gyroscope and accelerometer.
- rw007: The driver for the high-speed SPI Wi-Fi module.
Once you have successfully added the required software packages, you must configure the driver parameters to ensure the hardware communicates correctly with the RT-Thread OS. This step is critical because motion detection is a pattern recognition task on time-series data, and improper sampling or bus assignments will lead to data loss.
To begin, locate the SoftPackage section in your RT-Thread Settings, hover over the specific package (icm20608 or rw007), and click the "config" button. This will open the detailed configuration interface.
Hardware Driver Alignment
- Sensor Data Integrity: When configuring the icm20608, ensure the sampling rate is balanced. A rate that is too low (e.g., once every 20 seconds) loses vital temporal information, while a rate that is too high generates unnecessary network traffic and overhead.
- Bus Assignment: The rw007 Wi-Fi module communicates over SPI (spi2 in this project), while the icm20608 is read over I2C (i2c2 in the firmware below). Verify that the bus and pin assignments match the physical layout of the RT-Spark board to avoid resource conflicts or deadlocks.
- Device Management Prep: These configurations allow the board to acquire sensor readings and periodically transmit them to your device management platform via Wi-Fi.
Note: Please follow the exact configurations shown in the reference images below. Maintaining the "temporal correlation" in your data is essential for the machine learning model to accurately classify activities like walking or running.
With the hardware pins and drivers correctly configured in RT-Thread Studio, the system is now physically primed to interact with the environment. This step shifts the focus from static configuration to dynamic execution, where you will develop the embedded firmware necessary to transform raw sensor signals into actionable data.
This phase involves writing the logic to sample the icm20608 accelerometer and gyroscope at precise intervals to maintain temporal integrity, ensuring the patterns of movement—like the rhythmic cadence of walking or the high-impact bursts of running—are preserved for the machine learning model. Simultaneously, you will implement the network stack for the rw007 module to transmit these readings to your cloud backend with minimal latency. Through iterative debugging and console monitoring, you will verify that the data flow is robust and the communication schemes are free from the deadlocks or resource conflicts that can arise in complex IoT systems.
Replace your main.c with this:
/*
* File: main.c
* Purpose: Wi-Fi Connection + Full Sensor Data Display (Accel & Gyro)
*/
#include <rtthread.h>
#include <rtdevice.h>
#include <board.h>
#include <stdlib.h> /* For abs() */
#include <drv_spi.h> /* For SPI attachment */
/* Networking Headers */
#include <sys/socket.h>
#include <arpa/inet.h>
#include <netdev.h>
/* Sensor and Wi-Fi Drivers */
#include "icm20608.h"
#include "spi_wifi_rw007.h"
/* --- CONFIGURATION --- */
#define WIFI_SSID "PUT YOUR WIFI NAME HERE [SHOULD NOT BE A 5G WIFI]"
#define WIFI_PASSWORD "PUT YOUR WIFI PASSWORD HERE"
/* NOTE: I lowered this to 1000. 3000 is very hard to trigger by hand. */
#define MOTION_THRESHOLD 1000
#define SENSOR_I2C_BUS "i2c2"
/* --- SENSOR THREAD --- */
static void sensor_thread_entry(void *parameter)
{
icm20608_device_t dev;
/* Variables for Accelerometer */
rt_int16_t ax, ay, az;
rt_int16_t lax = 0, lay = 0; /* Last readings (for logic) */
/* Variables for Gyroscope */
rt_int16_t gx, gy, gz;
rt_kprintf(">> [SENSOR] Initializing ICM20608 on %s...\n", SENSOR_I2C_BUS);
/* 1. Initialize the sensor */
dev = icm20608_init(SENSOR_I2C_BUS);
if (dev == RT_NULL) {
rt_kprintf(">> [ERROR] Sensor init FAILED! Check connection.\n");
return;
}
/* Get baseline reading */
icm20608_get_accel(dev, &lax, &lay, &az);
while (1)
{
/* 2. Read BOTH Sensors */
rt_err_t res_accel = icm20608_get_accel(dev, &ax, &ay, &az);
rt_err_t res_gyro = icm20608_get_gyro(dev, &gx, &gy, &gz);
if (res_accel == RT_EOK && res_gyro == RT_EOK)
{
/* 3. Logic: Calculate Movement */
int delta_x = abs(ax - lax);
int delta_y = abs(ay - lay);
/* 4. Formatting: Combine Logic with Data Display */
if (delta_x > MOTION_THRESHOLD || delta_y > MOTION_THRESHOLD)
{
/* Motion Detected - Print with ALERT tag */
rt_kprintf(">> [ALERT] MOTION! | Accel(X:%d Y:%d Z:%d) | Gyro(X:%d Y:%d Z:%d)\n",
ax, ay, az, gx, gy, gz);
}
else
{
/* Quiet - Print with INFO tag */
rt_kprintf(">> [INFO ] Quiet... | Accel(X:%d Y:%d Z:%d) | Gyro(X:%d Y:%d Z:%d)\n",
ax, ay, az, gx, gy, gz);
}
/* Update "Last" values */
lax = ax;
lay = ay;
}
else
{
rt_kprintf(">> [ERROR] Failed to read sensor data.\n");
}
/* Delay 500ms so you can actually read the scrolling text */
rt_thread_mdelay(500);
}
}
/* --- MAIN FUNCTION --- */
int main(void)
{
/* Clear terminal */
rt_kprintf("\033[2J\033[H--- WiFi & Sensor Priority Boot ---\n");
/* 1. Initialize WiFi Hardware */
rt_hw_spi_device_attach("spi2", "spi20", GET_PIN(B, 12), RT_NULL);
rt_thread_mdelay(200);
/* 2. Connect to WiFi */
/* Removed #ifdef so this ALWAYS runs */
rt_kprintf("Connecting to WiFi: %s\n", WIFI_SSID);
rt_wlan_connect(WIFI_SSID, WIFI_PASSWORD);
/* Wait for connection (5s) */
rt_thread_mdelay(5000);
struct netdev *net_dev = netdev_get_by_name("wlan0");
if (net_dev && netdev_is_up(net_dev)) {
rt_kprintf("WiFi Connected! IP: %s\n", inet_ntoa(net_dev->ip_addr));
} else {
rt_kprintf("DHCP still working... Use 'ifconfig' to check later.\n");
}
rt_kprintf("--------------------------------------------------\n");
/* 3. Start Sensor Thread */
rt_thread_t tid = rt_thread_create("motion", sensor_thread_entry, RT_NULL, 2048, 25, 10);
if (tid) rt_thread_startup(tid);
return RT_EOK;
}

Locate your rtconfig.h file in the Project Explorer and open it. Then replace it with this code.
#ifndef RT_CONFIG_H__
#define RT_CONFIG_H__
/* Generated by Kconfiglib (https://github.com/ulfalizer/Kconfiglib) */
/* RT-Thread Kernel */
#define RT_NAME_MAX 8
#define RT_ALIGN_SIZE 4
#define RT_THREAD_PRIORITY_32
#define RT_THREAD_PRIORITY_MAX 32
#define RT_TICK_PER_SECOND 1000
#define RT_USING_OVERFLOW_CHECK
#define RT_USING_HOOK
#define RT_HOOK_USING_FUNC_PTR
#define RT_USING_IDLE_HOOK
#define RT_IDLE_HOOK_LIST_SIZE 4
#define IDLE_THREAD_STACK_SIZE 1024
/* kservice optimization */
/* end of kservice optimization */
#define RT_DEBUG
#define RT_DEBUG_COLOR
/* Inter-Thread communication */
#define RT_USING_SEMAPHORE
#define RT_USING_MUTEX
#define RT_USING_EVENT
#define RT_USING_MAILBOX
#define RT_USING_MESSAGEQUEUE
/* end of Inter-Thread communication */
/* Memory Management */
#define RT_USING_MEMPOOL
#define RT_USING_SMALL_MEM
#define RT_USING_SMALL_MEM_AS_HEAP
#define RT_USING_HEAP
/* end of Memory Management */
/* Kernel Device Object */
#define RT_USING_DEVICE
#define RT_USING_CONSOLE
#define RT_CONSOLEBUF_SIZE 128
#define RT_CONSOLE_DEVICE_NAME "uart1"
/* end of Kernel Device Object */
#define RT_VER_NUM 0x40101
/* end of RT-Thread Kernel */
#define ARCH_ARM
#define RT_USING_CPU_FFS
#define ARCH_ARM_CORTEX_M
#define ARCH_ARM_CORTEX_M4
/* RT-Thread Components */
#define RT_USING_COMPONENTS_INIT
#define RT_USING_USER_MAIN
#define RT_MAIN_THREAD_STACK_SIZE 2048
#define RT_MAIN_THREAD_PRIORITY 10
#define RT_USING_MSH
#define RT_USING_FINSH
#define FINSH_USING_MSH
#define FINSH_THREAD_NAME "tshell"
#define FINSH_THREAD_PRIORITY 20
#define FINSH_THREAD_STACK_SIZE 4096
#define FINSH_USING_HISTORY
#define FINSH_HISTORY_LINES 5
#define FINSH_USING_SYMTAB
#define FINSH_CMD_SIZE 80
#define MSH_USING_BUILT_IN_COMMANDS
#define FINSH_USING_DESCRIPTION
#define FINSH_ARG_MAX 10
/* Device Drivers */
#define RT_USING_DEVICE_IPC
#define RT_USING_SYSTEM_WORKQUEUE
#define RT_SYSTEM_WORKQUEUE_STACKSIZE 2048
#define RT_SYSTEM_WORKQUEUE_PRIORITY 23
#define RT_USING_SERIAL
#define RT_USING_SERIAL_V1
#define RT_SERIAL_USING_DMA
#define RT_SERIAL_RB_BUFSZ 64
#define RT_USING_I2C
//#define RT_I2C_DEBUG
#define RT_USING_I2C_BITOPS
#define RT_USING_PIN
#define RT_USING_SPI
#define RT_USING_SENSOR
#define RT_USING_SENSOR_CMD
#define RT_USING_WIFI
#define RT_USING_WLAN
#define RT_WLAN_DEVICE_STA_NAME "wlan0"
#define RT_WLAN_DEVICE_AP_NAME "wlan1"
#define RT_WLAN_SSID_MAX_LENGTH 32
#define RT_WLAN_PASSWORD_MAX_LENGTH 32
#define RT_WLAN_DEV_EVENT_NUM 2
#define RT_WLAN_MANAGE_ENABLE
#define RT_WLAN_SCAN_WAIT_MS 10000
#define RT_WLAN_CONNECT_WAIT_MS 10000
#define RT_WLAN_SCAN_SORT
#define RT_WLAN_MSH_CMD_ENABLE
#define RT_WLAN_AUTO_CONNECT_ENABLE
#define AUTO_CONNECTION_PERIOD_MS 2000
#define RT_WLAN_CFG_ENABLE
#define RT_WLAN_CFG_INFO_MAX 3
#define RT_WLAN_PROT_ENABLE
#define RT_WLAN_PROT_NAME_LEN 8
#define RT_WLAN_PROT_MAX 2
#define RT_WLAN_DEFAULT_PROT "lwip"
#define RT_WLAN_PROT_LWIP_ENABLE
#define RT_WLAN_PROT_LWIP_NAME "lwip"
#define RT_WLAN_WORK_THREAD_ENABLE
#define RT_WLAN_WORKQUEUE_THREAD_NAME "wlan"
#define RT_WLAN_WORKQUEUE_THREAD_SIZE 2048
#define RT_WLAN_WORKQUEUE_THREAD_PRIO 15
/* Using USB */
/* end of Using USB */
/* end of Device Drivers */
/* C/C++ and POSIX layer */
#define RT_LIBC_DEFAULT_TIMEZONE 8
/* POSIX (Portable Operating System Interface) layer */
/* Interprocess Communication (IPC) */
/* Socket is in the 'Network' category */
/* end of Interprocess Communication (IPC) */
/* end of POSIX (Portable Operating System Interface) layer */
/* end of C/C++ and POSIX layer */
/* Network */
#define RT_USING_SAL
#define SAL_INTERNET_CHECK
/* Docking with protocol stacks */
#define SAL_USING_LWIP
/* end of Docking with protocol stacks */
#define SAL_SOCKETS_NUM 16
#define RT_USING_NETDEV
#define NETDEV_USING_IFCONFIG
#define NETDEV_USING_PING
#define NETDEV_USING_NETSTAT
#define NETDEV_USING_AUTO_DEFAULT
#define NETDEV_IPV4 1
#define NETDEV_IPV6 0
#define RT_USING_LWIP
#define RT_USING_LWIP203
#define RT_USING_LWIP_VER_NUM 0x20003
#define RT_LWIP_MEM_ALIGNMENT 4
#define RT_LWIP_IGMP
#define RT_LWIP_ICMP
#define RT_LWIP_DNS
#define RT_LWIP_DHCP
#define IP_SOF_BROADCAST 1
#define IP_SOF_BROADCAST_RECV 1
/* Static IPv4 Address */
#define RT_LWIP_IPADDR "192.168.1.30"
#define RT_LWIP_GWADDR "192.168.1.1"
#define RT_LWIP_MSKADDR "255.255.255.0"
/* end of Static IPv4 Address */
#define RT_LWIP_UDP
#define RT_LWIP_TCP
#define RT_LWIP_RAW
#define RT_MEMP_NUM_NETCONN 8
#define RT_LWIP_PBUF_NUM 16
#define RT_LWIP_RAW_PCB_NUM 4
#define RT_LWIP_UDP_PCB_NUM 4
#define RT_LWIP_TCP_PCB_NUM 4
#define RT_LWIP_TCP_SEG_NUM 40
#define RT_LWIP_TCP_SND_BUF 8196
#define RT_LWIP_TCP_WND 8196
#define RT_LWIP_TCPTHREAD_PRIORITY 10
#define RT_LWIP_TCPTHREAD_MBOX_SIZE 8
#define RT_LWIP_TCPTHREAD_STACKSIZE 1024
#define RT_LWIP_ETHTHREAD_PRIORITY 12
#define RT_LWIP_ETHTHREAD_STACKSIZE 1024
#define RT_LWIP_ETHTHREAD_MBOX_SIZE 8
#define LWIP_NETIF_STATUS_CALLBACK 1
#define LWIP_NETIF_LINK_CALLBACK 1
#define SO_REUSE 1
#define LWIP_SO_RCVTIMEO 1
#define LWIP_SO_SNDTIMEO 1
#define LWIP_SO_RCVBUF 1
#define LWIP_SO_LINGER 0
#define LWIP_NETIF_LOOPBACK 0
#define RT_LWIP_USING_PING
/* end of Network */
/* Utilities */
/* end of Utilities */
/* end of RT-Thread Components */
/* RT-Thread online packages */
/* IoT - internet of things */
/* Wi-Fi */
/* Marvell WiFi */
/* end of Marvell WiFi */
/* Wiced WiFi */
/* end of Wiced WiFi */
#define PKG_USING_RW007
#define PKG_USING_RW007_V201
#define RW007_USING_STM32_DRIVERS
#define RW007_SPI_MAX_HZ 30000000
#define RW007_SPI_BUS_NAME "spi2"
#define RW007_CS_PIN 90
#define RW007_BOOT0_PIN 29
#define RW007_BOOT1_PIN 90
#define RW007_INT_BUSY_PIN 107
#define RW007_RST_PIN 111
/* end of Wi-Fi */
/* IoT Cloud */
/* end of IoT Cloud */
/* end of IoT - internet of things */
/* security packages */
/* end of security packages */
/* language packages */
/* JSON: JavaScript Object Notation, a lightweight data-interchange format */
/* end of JSON: JavaScript Object Notation, a lightweight data-interchange format */
/* XML: Extensible Markup Language */
/* end of XML: Extensible Markup Language */
/* end of language packages */
/* multimedia packages */
/* LVGL: powerful and easy-to-use embedded GUI library */
/* end of LVGL: powerful and easy-to-use embedded GUI library */
/* u8g2: a monochrome graphic library */
/* end of u8g2: a monochrome graphic library */
/* PainterEngine: A cross-platform graphics application framework written in C language */
/* end of PainterEngine: A cross-platform graphics application framework written in C language */
/* end of multimedia packages */
/* tools packages */
/* end of tools packages */
/* system packages */
/* enhanced kernel services */
/* end of enhanced kernel services */
/* acceleration: Assembly language or algorithmic acceleration packages */
/* end of acceleration: Assembly language or algorithmic acceleration packages */
/* CMSIS: ARM Cortex-M Microcontroller Software Interface Standard */
/* end of CMSIS: ARM Cortex-M Microcontroller Software Interface Standard */
/* Micrium: Micrium software products porting for RT-Thread */
/* end of Micrium: Micrium software products porting for RT-Thread */
/* end of system packages */
/* peripheral libraries and drivers */
#define PKG_USING_SENSORS_DRIVERS
#define PKG_USING_ICM20608
#define PKG_USING_ICM20608_LATEST_VERSION
/* Kendryte SDK */
/* end of Kendryte SDK */
/* end of peripheral libraries and drivers */
/* AI packages */
/* end of AI packages */
/* miscellaneous packages */
/* project laboratory */
/* end of project laboratory */
/* samples: kernel and components samples */
/* end of samples: kernel and components samples */
/* entertainment: terminal games and other interesting software packages */
/* end of entertainment: terminal games and other interesting software packages */
/* end of miscellaneous packages */
/* Arduino libraries */
/* Projects */
/* end of Projects */
/* Sensors */
/* end of Sensors */
/* Display */
/* end of Display */
/* Timing */
/* end of Timing */
/* Data Processing */
/* end of Data Processing */
/* Data Storage */
/* Communication */
/* Device Control */
/* end of Device Control */
/* Other */
/* Signal IO */
/* end of Signal IO */
/* Uncategorized */
/* end of Arduino libraries */
/* end of RT-Thread online packages */
#define SOC_FAMILY_STM32
#define SOC_SERIES_STM32F4
/* Hardware Drivers Config */
#define SOC_STM32F407ZG
#define BOARD_STM32F407_SPARK
/* Onboard Peripheral Drivers */
#define BSP_USING_USB_TO_USART
#define BSP_USING_ICM20608
/* end of Onboard Peripheral Drivers */
/* On-chip Peripheral Drivers */
#define BSP_USING_GPIO
#define BSP_USING_UART
#define BSP_USING_UART1
#define BSP_USING_SPI
#define BSP_USING_SPI2
#define BSP_USING_I2C
#define BSP_USING_I2C1
#define BSP_I2C1_SCL_PIN 24
#define BSP_I2C1_SDA_PIN 25
#define BSP_USING_I2C2
#define BSP_I2C2_SCL_PIN 81
#define BSP_I2C2_SDA_PIN 80
/* end of On-chip Peripheral Drivers */
/* Board extended module Drivers */
/* end of Board extended module Drivers */
/* end of Hardware Drivers Config */
#endif

Step 5: Machine Learning Model Development

With the hardware primed for data transmission, the focus shifts to the "brain" of the system: the machine learning model. This phase involves transitioning from embedded C to Python to build a deep learning classifier. By leveraging the TensorFlow framework, you will create a model capable of recognizing complex patterns in raw accelerometer and gyroscope data to distinguish between activities like walking and running.
1. Environment and Dataset Preparation
Before training, you must establish a data science environment on your local workstation.
- Acquire Training Data: Download a Human Activity Recognition (HAR) dataset in .csv format from Kaggle or through the GitHub link found in the attachments below. This file should contain time-series readings for various physical activities (a quick way to inspect the file is shown after this list).
- Install Python: Ensure Python 3.x is installed on your system.
- Install AI Frameworks: Open your terminal or command prompt and install the required machine learning libraries by executing: pip install tensorflow pandas scikit-learn
- Project Organization: In Visual Studio Code (VSC), create a dedicated project folder named ActivityTracker. Move your downloaded Kaggle .csv file into this folder.
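Before training, it helps to sanity-check the dataset so that column names and class balance match what the training script expects. The snippet below is a minimal sketch, not part of the official attachments; the file name dataset.csv and the label column activity are assumptions that you should adjust to your download.

import pandas as pd

# quick_look.py - sanity-check the downloaded HAR dataset before training.
df = pd.read_csv("dataset.csv")        # replace with your actual file name
print(df.head())                       # confirm the six sensor columns exist
print(df["activity"].value_counts())   # class balance between walking/running
print(df.isna().sum())                 # missing values that would break training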
2. Model Training Logic
The training script will process the raw sensor data, normalize it for consistency, and train a neural network to identify activity-specific signatures.
- Create a new file in your ActivityTracker folder named train_model.py.
- Implementation Note: Ensure the script is located in the same directory as your dataset to allow for seamless data loading.
- Paste the provided TensorFlow training code into train_model.py and save the file. A minimal sketch of what this script looks like is shown below.
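The full training code is distributed via the attachments; the following is only a minimal sketch of the idea. It assumes a binary walking/running label column named activity and the same six feature columns that server_final.py later expects (acceleration_x through gyro_z); rename them to match your dataset if they differ.

import pandas as pd
import joblib
import tensorflow as tf
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

# train_model.py - minimal training sketch (column names are assumptions).
FEATURES = ['acceleration_x', 'acceleration_y', 'acceleration_z',
            'gyro_x', 'gyro_y', 'gyro_z']

df = pd.read_csv('dataset.csv')          # your Kaggle HAR file
X = df[FEATURES].values
y = df['activity'].values                # 0 = walking, 1 = running (assumed)

# Normalize features; the same scaler must be reused on live data.
scaler = StandardScaler()
X = scaler.fit_transform(X)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

# Small dense network with a sigmoid output, matching the server's
# "probability > 0.5 means Running" logic.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation='relu', input_shape=(len(FEATURES),)),
    tf.keras.layers.Dense(16, activation='relu'),
    tf.keras.layers.Dense(1, activation='sigmoid'),
])
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
model.fit(X_train, y_train, epochs=10, batch_size=64,
          validation_data=(X_test, y_test))

# Save the two artifacts consumed by server_final.py.
model.save('activity_model_tf.h5')
joblib.dump(scaler, 'scaler.pkl')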
3. Executing the Training Script
Once the code and data are in place, you can initiate the training process via the terminal.
- Open PowerShell or your preferred terminal.
- Navigate to your project directory by typing this in your terminal: cd Desktop\ActivityTracker
- Run the training script: python train_model.py
4. Verifying Artifacts
Upon successful completion of the script, your folder will now contain two critical "artifacts" required for the next phase of the project:
- activity_model_tf.h5: The trained TensorFlow model file containing the neural network's architecture and learned weights.
- scaler.pkl: A "Pickle" file containing the data scaling parameters used during training. This ensures that live data from the RT-Spark board is normalized exactly like the training data before being classified.
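If you want to confirm both artifacts are valid before moving on, a quick load test is enough. This is an optional sketch using standard TensorFlow and joblib calls, run from the same folder as the artifacts.

import joblib
import tensorflow as tf

# check_artifacts.py - confirm the saved model and scaler load cleanly.
model = tf.keras.models.load_model('activity_model_tf.h5')
scaler = joblib.load('scaler.pkl')

model.summary()                       # prints the layer structure
print("Scaler mean:", scaler.mean_)   # parameters learned from the training data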
Note: Make sure that the two files are saved in the same folder where server_final.py will be located (created in the next step).

Step 6: Cloud Integration via Google Firebase
With the embedded hardware successfully transmitting data, the next phase involves establishing the Device Management Platform and cloud infrastructure. For this project, Google Firebase serves as the central hub, acting as the real-time database that stores incoming sensor readings and predicted user activities for retrieval by the mobile application.
1. Initializing the Firebase Project
To begin, you must create a dedicated project environment within the Firebase console to manage your IoT data.
- Navigate to the Firebase Console and click "Add project."
- Follow the on-screen prompts by clicking "Continue" until you reach the "Select an account" step.
- Choose an existing Google Analytics account or create a new one, then select "Create project" to provision your cloud resources.
2. Provisioning the Realtime Database
Since activity tracking is a time-sensitive task, a Realtime Database is required to ensure predictions are updated with negligible latency.
- In the left-hand navigation panel, expand the "Build" category and select "Realtime Database."
- Click "Create Database" and follow the setup wizard.
- Crucial Step: Once created, locate and copy the Database URL (e.g., https://your-project-id.firebaseio.com/). This URL is required for your backend server to communicate with the cloud.
3. Security and Authentication
To allow your backend server (e.g., Node.js or Python) to securely write data to Firebase, you must generate a private service account key.
- Click the Gear Icon next to "Project Overview" and select "Project Settings."
- Navigate to the "Service accounts" tab.
- Click the "Generate new private key" button, then confirm by clicking "Generate key." This will download a JSON file containing your credentials.
- Backend Linkage: Paste your previously copied Database URL into your Node.js or Python server configuration file under the databaseURL field.
4. Local Environment Setup
Finally, you must install the necessary administrative libraries to allow your development environment to interact with the Firebase API.
- Open the Visual Studio Code (VSC) Terminal.
- Execute the following command to install the Firebase Admin SDK and Flask (the web framework used by the server): pip install firebase-admin flask
- Open a text editor (Notepad or VS Code) and paste the code below. Save the file as server_final.py.
import firebase_admin
from firebase_admin import credentials, db
from flask import Flask, request, jsonify
import datetime
import pandas as pd
import tensorflow as tf
import joblib
import json
import threading
# --- CONFIGURATION ---
# Ensure 'firebase_key.json' is in the same folder as this script
cred = credentials.Certificate("firebase_key.json")
firebase_admin.initialize_app(cred, {
'databaseURL': 'https://finalproject-6f2e1-default-rtdb.firebaseio.com/'
})
app = Flask(__name__)
# --- GLOBAL VARIABLES ---
# This acts as the bridge between RT-Spark and Android Emulator
current_system_status = {
"activity": "Waiting for Spark...",
"confidence": "0.00",
"last_ip": "None",
"timestamp": "0"
}
# --- LOAD AI MODEL ---
print("⏳ Loading AI Model...")
try:
model = tf.keras.models.load_model('activity_model_tf.h5')
scaler = joblib.load('scaler.pkl')
print("✅ Model & Scaler Loaded.")
except Exception as e:
print(f"❌ Error loading model: {e}")
model = None
# --- FIREBASE UPLOAD FUNCTION (Runs in background) ---
def upload_task(packet):
try:
ref = db.reference('sensor_logs')
ref.push(packet)
# print("✅ Saved to Firebase") # Uncomment if you want to see every save
except Exception as e:
print(f"⚠️ Firebase Upload Failed: {e}")
# --- MAIN ENDPOINT: RECEIVE DATA FROM RT-SPARK ---
@app.route('/data', methods=['POST'])
def receive_data():
global current_system_status
client_ip = request.remote_addr
try:
# 1. Parse Data
raw_text = request.get_data(as_text=True)
input_json = json.loads(raw_text)
# 2. Scale Data (RT-Spark Raw -> G-Force)
# Assuming MPU6050 defaults: Accel / 16384, Gyro / 131
ax = float(input_json.get('ax', 0)) / 16384.0
ay = float(input_json.get('ay', 0)) / 16384.0
az = float(input_json.get('az', 0)) / 16384.0
gx = float(input_json.get('gx', 0)) / 131.0
gy = float(input_json.get('gy', 0)) / 131.0
gz = float(input_json.get('gz', 0)) / 131.0
# 3. AI Prediction
predicted_label = "Unknown"
prediction_prob = 0.0
if model:
# Prepare data for model
features = ['acceleration_x', 'acceleration_y', 'acceleration_z', 'gyro_x', 'gyro_y', 'gyro_z']
df = pd.DataFrame([[ax, ay, az, gx, gy, gz]], columns=features)
X_scaled = scaler.transform(df)
# Predict
prob = float(model.predict(X_scaled, verbose=0)[0][0])
predicted_label = "Running" if prob > 0.5 else "Walking"
# Filter Standby (If gravity is the only force)
total_accel = abs(ax) + abs(ay) + abs(az)
total_gyro = abs(gx) + abs(gy) + abs(gz)
# If acceleration is near 1G (approx 0.9 to 1.1) and rotation is low
if 0.8 < total_accel < 1.2 and total_gyro < 2.0:
predicted_label = "Standby"
prediction_prob = 0.0
else:
prediction_prob = prob
# 4. UPDATE DASHBOARD STATUS (Add raw values here!)
current_system_status["activity"] = predicted_label
current_system_status["confidence"] = f"{prediction_prob:.2f}"
current_system_status["last_ip"] = client_ip
current_system_status["timestamp"] = str(datetime.datetime.now())
# ADD THESE LINES so the Android Charts work:
current_system_status["ax"] = ax
current_system_status["ay"] = ay
current_system_status["az"] = az
current_system_status["gx"] = gx
current_system_status["gy"] = gy
current_system_status["gz"] = gz
print(f"⚡ RT-Spark says: {predicted_label} ({prediction_prob:.2f})")
# 5. UPLOAD TO FIREBASE (Background Thread)
upload_packet = {
"timestamp": str(datetime.datetime.now()),
"device_ip": client_ip,
"ax": ax, "ay": ay, "az": az,
"gx": gx, "gy": gy, "gz": gz,
"activity": predicted_label,
"confidence": prediction_prob
}
# We use a thread so the Spark doesn't have to wait for Firebase
thread = threading.Thread(target=upload_task, args=(upload_packet,))
thread.start()
return jsonify({"status": "success", "prediction": predicted_label}), 200
except Exception as e:
print(f"❌ Server Error: {e}")
return jsonify({"error": str(e)}), 500
# --- DASHBOARD ENDPOINT: READ STATUS FROM ANDROID ---
@app.route('/status', methods=['GET'])
def get_status():
# The Emulator calls this to see the latest Spark data
return jsonify(current_system_status), 200
# --- HISTORY ENDPOINT: OPTIONAL (For App History Screen) ---
@app.route('/history', methods=['GET'])
def get_history():
try:
# Retrieve last 20 records from Firebase
ref = db.reference('sensor_logs')
snapshot = ref.order_by_key().limit_to_last(20).get()
history_list = []
if snapshot:
for key, val in snapshot.items():
history_list.append({
"id": key,
"activity": val.get("activity", "Unknown"),
"timestamp": val.get("timestamp", "")
})
# Reverse to show newest first
return jsonify(list(reversed(history_list))), 200
except Exception as e:
return jsonify([]), 200
if __name__ == '__main__':
# host='0.0.0.0' is required for External Devices (Spark) and Emulator (10.0.2.2)
    app.run(host='0.0.0.0', port=5000)

Note: The "artifacts" you obtained in the previous step (activity_model_tf.h5 and scaler.pkl) must be placed in the same folder as server_final.py.

Step 7: Mobile Application Development and Deployment

The final phase of the project involves developing the user-facing interface: a custom Android application. This application serves as the visualization layer, fetching real-time activity predictions and historical data from your Firebase database and cloud backend. By utilizing Android Studio and the Volley HTTP library, the app provides a seamless and responsive user experience, allowing users to monitor their physical activity directly from their smartphones.
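Before building the mobile app, it can be worth verifying the backend end to end. The short script below is an optional sketch (not part of the original attachments): it simulates one RT-Spark reading by POSTing raw values to /data and then reads back /status, assuming server_final.py is running locally on port 5000 and that the requests package is installed (pip install requests).

import requests

# test_backend.py - optional smoke test for server_final.py.
BASE = "http://127.0.0.1:5000"   # adjust if the server runs on another machine

# Simulate one raw RT-Spark packet (same integer units the firmware reports).
sample = {"ax": 2100, "ay": -300, "az": 16200, "gx": 40, "gy": -15, "gz": 8}
resp = requests.post(f"{BASE}/data", json=sample, timeout=5)
print("POST /data ->", resp.status_code, resp.json())

# Read back the dashboard status the Android app will later poll.
status = requests.get(f"{BASE}/status", timeout=5).json()
print("GET /status ->", status["activity"], status["confidence"])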
1. Initializing the Android Project
To begin development, you must set up the project structure and an emulation environment to test the mobile interface.
- Launch Android Studio and select "New Project."
- In the project templates, select "Phone and Tablet" followed by "Empty Activity" to ensure a clean starting point.
- Name your project (e.g., ActivityTrackerMobile) and click "Finish" to initialize the workspace.
2. Emulated Device Configuration
Since a physical Android device may not always be available, you can use a virtual device to test your application’s UI and connectivity.
- Locate the Device Manager icon (represented by a phone and Android icon) on the right-hand panel.
- Click the "+" (Add a new device) icon.
- Select "Remote Devices" and choose a phone model (e.g., Motorola Moto G7 or a similar standard model).
- Once the device is added, click the "Start" (Play) button. The virtual phone will appear under the "Running Devices" tab below the Device Manager.
3. Integrating Source Code via GitHub
Rather than building from scratch, you will pull the pre-configured application logic and UI components from a central repository, accessible via the attached GitHub link in the "Attachments" section, or by clicking this.
- Connect your GitHub account to Android Studio via the settings menu.
- Navigate to the Main Menu (the three-line icon next to the Android logo) and select Git > Pull.
- Ensure the branch is set to main and click "Pull" to sync the project files, including the Volley API integrations and Firebase listeners.
4. Execution and Testing
With the code synced, you can now deploy the application to your emulated device.
- Click the green "Run 'app'" button in the top toolbar, or use the keyboard shortcut Shift + F10.
- Once the application launches on the emulator, you will be prompted with a login screen.
- Enter a username and password to enter the main dashboard. Here, you can verify that the app correctly displays real-time activity predictions (e.g., "Standby" or "Running") and historical records retrieved from your cloud database.
This step explains how the mobile application functions as a remote monitor to provide live updates without relying on the smartphone's internal sensors.
1. The Dashboard (Android App) Role: The application serves strictly as the user interface and viewer. It does not use its own sensors; instead, it acts as a remote monitor for the data being processed in the cloud.
2. Network Polling Connection: To maintain a real-time display, the app constantly asks the server for the latest status using HTTP GET requests via the Volley library.
- Polling Frequency: The app is configured to poll the server every 300ms.
- Data Retrieval: The server replies with the latest activity classification and raw sensor values.
- Live Visualization: The app uses these replies to feed and draw live charts and gauges directly on the UI.
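To see exactly what the app consumes, you can mimic its polling loop from the command line. This is an illustrative sketch only (the real app uses Volley on Android); it assumes server_final.py is reachable at the address below and that requests is installed.

import time
import requests

# poll_status.py - mimics the Android dashboard's 300 ms polling loop.
BASE = "http://127.0.0.1:5000"   # the emulator reaches the host via http://10.0.2.2:5000

while True:
    s = requests.get(f"{BASE}/status", timeout=2).json()
    # Same fields the charts consume: activity, confidence, and raw axes.
    print(s["activity"], s["confidence"], s.get("ax"), s.get("gy"))
    time.sleep(0.3)              # ~300 ms, matching the app's polling frequency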
After you finish all of these steps, you can watch the demonstration video to see what the finished project should look like by simply clicking this.
__________________________________________________
Congratulations! You have completed the full-stack implementation. Your system now successfully captures motion on the RT-Spark board, classifies it in the cloud using TensorFlow, and visualizes the results on a mobile app.
__________________________________________________
For the source code, click this.