Real-time earthquake detection is very important; it can save valuable lives. In this project we have developed an earthquake detection system. It consists of two parts: a mobile application used for monitoring and notification, and an embedded system (the physical device) used for detecting earthquakes. The physical device must be placed on the ground surface to sense the vibration pattern during an earthquake.
The device is built around the QuickLogic QuickFeather development board. A Wi-Fi module (an ESP32 development board) transfers the earthquake (vibration) data to the cloud, and the cloud broadcasts the data to all connected devices (mobile apps).
The prototype system consists of the hardware components listed below:
- QuickFeather by QuickLogic
- Firebeetle ESP32 by DFrobot
along with the following software packages:
- QORC-SDK by QuickLogic
- Arduino IDE
- Data Capture Lab, Analytics Studio, and Open Gateway by SensiML
The QuickFeather development board and the ESP32 board are connected over UART, with a simple schematic illustrated in the figure below:
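To complement the schematic, here is a minimal Arduino-style sketch for the ESP32 side of this UART link. The pin assignments (GPIO16/GPIO17 for the second UART) and the 115200 baud rate are assumptions for illustration only; match them to your actual wiring and to the UART settings of the QuickFeather firmware.

```cpp
// Minimal sketch (assumed wiring: QuickFeather TX -> ESP32 GPIO16,
// QuickFeather RX -> ESP32 GPIO17, common GND, 115200 baud).
// It simply forwards whatever the QuickFeather sends over UART to the USB serial monitor.

#define QF_BAUD   115200
#define QF_RX_PIN 16   // ESP32 pin wired to the QuickFeather TX
#define QF_TX_PIN 17   // ESP32 pin wired to the QuickFeather RX

void setup() {
  Serial.begin(115200);                                      // USB serial for debugging
  Serial2.begin(QF_BAUD, SERIAL_8N1, QF_RX_PIN, QF_TX_PIN);  // UART link to the QuickFeather
}

void loop() {
  while (Serial2.available()) {
    Serial.write(Serial2.read());  // echo the QuickFeather output for inspection
  }
}
```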
(Note: The SensiML Toolkit documentation provides guidance for installing and using the Data Capture Lab and Analytics Studio used in this project. Always refer to that documentation for details.)
Download the audio data capture binary for the QuickFeather device from the SensiML documentation and follow the tutorial to install it. Then capture samples of ambient (normal) and earthquake (vibration) audio using the configuration settings shown below.
After capturing a good amount of samples, label them as ambient or earthquake. Here we use a mobile phone's vibration pattern as the earthquake sample data.
**Data Preparation and Model Building**
After using the Data Capture Lab (DCL) to capture a reasonable amount of ambient and earthquake audio samples and adding the requisite labels, the audio files are uploaded to the SensiML Analytics Studio for data preparation and modeling. Below are sample audio segments retrieved from the DCL audio files.
We build the model using the Build Model section of the Analytics Studio (refer to the SensiML Toolkit documentation for details).
The "Explore Model" section of the SensiML Analytics Studio is helpful for evaluating the developed model. The Confusion Matrix of our developed model is given below:
Once the model is ready, the next step is to create a Knowledge Pack that can be deployed back to the QuickFeather for inference in the wild. In the "Download Model" section of the Studio, confirm the desired settings for the deployment package and select "Download". (Again, refer to the SensiML documentation for details on the different configuration options.)
**Prepare the ESP32 Board and Thinger.io Cloud**
The ESP32 development board collects the detection output from the QuickFeather board and is responsible for transferring it to the cloud. We prepared the ESP32 board by flashing a sketch onto it with the Arduino IDE.
Refer to https://docs.thinger.io/ for cloud preparation, device connection, and endpoint creation in Thinger.io; those videos are also helpful for learning Thinger.io. A minimal sketch for the ESP32 side is shown below.
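The sketch below is a minimal illustration using the Thinger.io Arduino library (ThingerESP32). It assumes the QuickFeather sends a plain-text class label ("ambient" or "earthquake") over UART, and that an endpoint (here called earthquake_alert) has already been created in the Thinger.io console; the credentials, Wi-Fi settings, pins, and label format are placeholders rather than the exact code used in this project.

```cpp
#include <ThingerESP32.h>

// Placeholder Thinger.io credentials -- replace with your own account/device values.
#define USERNAME          "your_thinger_username"
#define DEVICE_ID         "earthquake_detector"
#define DEVICE_CREDENTIAL "your_device_credential"

ThingerESP32 thing(USERNAME, DEVICE_ID, DEVICE_CREDENTIAL);

String lastLabel = "ambient";  // latest classification received from the QuickFeather

void setup() {
  Serial2.begin(115200, SERIAL_8N1, 16, 17);               // UART link to the QuickFeather (assumed pins)
  thing.add_wifi("your_wifi_ssid", "your_wifi_password");  // placeholder Wi-Fi credentials

  // Expose the latest classification as a readable resource in the Thinger.io cloud.
  thing["status"] >> [](pson &out) {
    out["label"] = lastLabel.c_str();
  };
}

void loop() {
  thing.handle();  // maintain the Thinger.io connection

  if (Serial2.available()) {
    String label = Serial2.readStringUntil('\n');
    label.trim();
    if (label.length() > 0) {
      lastLabel = label;
      if (label == "earthquake") {
        // Call a Thinger.io endpoint that notifies the connected mobile apps.
        pson data;
        data["label"] = label.c_str();
        thing.call_endpoint("earthquake_alert", data);
      }
    }
  }
}
```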
We have developed a mobile application for the Android platform to monitor earthquake data coming from physical devices through the cloud.
**Demonstration**
The Android APK is available in the source section (play with it if you want).