This is a project made for Assignment 1 of the Internet of Things class at the Sapienza University of Rome.
In this assignment, I've created a cloud-based IoT system that collects information from a set of virtual environmental sensors using the MQTT protocol. Furthermore, I have created a simple website dashboard to display the data collected from the sensors.
MQTT communication is managed by the cloud-based backend of the Google IoT platform.
The following sections are a hands-on tutorial on how to set up and run the system.
TECHNOLOGY USED: Node.js, MQTT, WebSockets, jQuery, Bootstrap CSS and Handlebars.
The first thing to do is to create an account on Google Cloud Platform, create a "New Project" and enable the Cloud IoT Core and Cloud Pub/Sub APIs.
Once you have opened the "IoT Core" section, follow these simple steps:
1) Create a registry
2) Create devices and add them to the registry (four devices for our purposes)
3) Create a subscription and connect it to the devices
Notice that for the second step you have to create a certificate (a public/private key pair) for every device. This guide, provided by Google, contains all the detailed steps:
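As a sketch of the per-device certificate step, an RS256 key pair can be generated with OpenSSL as shown below (the file names are illustrative; Google's guide covers the exact registration flow):

```shell
# Generate an RSA key pair for one device (repeat for each device).
# The public key (rsa_public.pem) is the one you register with the
# device in the IoT Core console; the private key stays on the device.
openssl genpkey -algorithm RSA -out rsa_private.pem -pkeyopt rsa_keygen_bits:2048
openssl rsa -in rsa_private.pem -pubout -out rsa_public.pem
```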
Now that our platform is ready, we can see how to develop the virtual sensors.
Virtual Sensors
I have created a stand-alone program for every sensor; together they represent a virtual environmental station that periodically generates random values for:
- temperature (-50... 50 Celsius)
- humidity (0... 100%)
- wind direction (0... 360 degrees)
- rain height (0... 50 mm / h)
Each virtual sensor publishes these random values on the Google MQTT channel. The sensor scripts are based on Node.js, and the code is essentially the same for every sensor apart from the values to send; for this reason, I'll show only the temperature sensor.
Note: I have separated the code of each device for code modularity, but it is also possible to use only a single script.
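The value generation shared by the four sensors can be sketched as follows (getRndInteger is the helper mentioned further below; the ranges come from the list above, and the `ranges` table is my own illustrative grouping):

```javascript
// Random value generation shared by all four virtual sensors.
// Returns a random integer between min and max, inclusive.
function getRndInteger(min, max) {
  return Math.floor(Math.random() * (max - min + 1)) + min;
}

// Ranges from the list above (grouping into one table is illustrative).
const ranges = {
  temperature: [-50, 50],  // Celsius
  humidity: [0, 100],      // %
  windDirection: [0, 360], // degrees
  rainHeight: [0, 50],     // mm/h
};

// Produce one reading for the named sensor.
function readSensor(name) {
  const [min, max] = ranges[name];
  return getRndInteger(min, max);
}
```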
Here I manage the main Publish/Subscribe process. The first thing to do is to connect the sensor to the Google platform by entering the arguments created previously (in the Set Up Google Cloud Platform section).
If the connection is successful, the sensor starts sending the random values (lines 100-105). The subsequent functions handle errors (line 113), the closing of the connection (line 109) and the other messages received from the platform (line 117).
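The connection parameters required by the Google IoT Core MQTT bridge follow a fixed format, sketched below. The project, region, registry and device names are placeholders standing in for the ones created in the setup section; the commented-out `mqtt.connect` call shows how they would be used with the `mqtt` npm package:

```javascript
// The bridge requires a client ID in this exact path format.
function mqttClientId(projectId, region, registryId, deviceId) {
  return `projects/${projectId}/locations/${region}/registries/${registryId}/devices/${deviceId}`;
}

// Telemetry messages are published to this device-specific topic.
function telemetryTopic(deviceId) {
  return `/devices/${deviceId}/events`;
}

// With the mqtt package, the connection would look like this (not executed here):
// const client = mqtt.connect({
//   host: 'mqtt.googleapis.com', port: 8883, protocol: 'mqtts',
//   clientId: mqttClientId('my-project', 'europe-west1', 'my-registry', 'temperature'),
//   username: 'unused',        // the bridge ignores the username
//   password: createJwt(...),  // a JWT signed with the device's private key
// });
```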
Generate and Publish values
The image above shows how to create and publish the values to the platform. Values are generated by getRndInteger (line 38) and then asynchronously published by the function publishAsync (line 43) every 5 seconds.
I used Quality of Service level 1 (qos 1, line 43), which is equivalent to the at-least-once paradigm. With this choice we may receive duplicates, but we get higher reliability than qos 0 (at most once), which can lose data. (Note: Google IoT Core doesn't support qos 2, exactly once.)
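Because at-least-once delivery can produce duplicates, the receiving side can deduplicate messages by ID. This is not part of the original project, just a small sketch of how qos 1 duplicates could be handled (Pub/Sub does attach an ID to each message):

```javascript
// With at-least-once delivery (qos 1), the same message may arrive twice.
// Receiver-side deduplication keyed on a message id filters the repeats.
const seen = new Set();

function isDuplicate(messageId) {
  if (seen.has(messageId)) return true;
  seen.add(messageId);
  return false;
}
```

In practice the `seen` set would need to be bounded (e.g. evicting old IDs), but the idea is the same.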
Furthermore, it is important to know how the payload is structured:
deviceId ; temperature ; date
This structure allows the website dashboard to split the message and to know which device sent the value, the value itself, and the date. The date is important for the database and for the dashboard, which has to show the values received during the last hour for every sensor. We will look at this more deeply in the next section.
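The payload format above can be sketched as a pair of build/parse helpers (assuming ";" as the separator with no surrounding spaces, and a Unix timestamp for the date; both are my reading of the format, not shown verbatim in the source):

```javascript
// Build the payload sent by a sensor: deviceId ; value ; date
function buildPayload(deviceId, value, date) {
  return `${deviceId};${value};${date}`;
}

// Split an incoming payload back into its three fields,
// as the dashboard does on the receiving side.
function parsePayload(payload) {
  const [deviceId, value, date] = payload.split(';');
  return { deviceId, value: Number(value), date };
}
```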
The code for the temperature sensor (and for the other devices) is available in the GitHub repository (see the end of the page) at the path: devices/temperature/temperature.js
For more details on this part, I based my code on the following Google guide:
Web-based Dashboard
The dashboard is a simple web application based on Node.js, MongoDB, Bootstrap, jQuery and Handlebars. Furthermore, it uses the WebSocket protocol to display the values in real time.
The main functionalities of the dashboard are:
- Display the latest values received from all the sensors of a specified environmental station.
- Display the values received during the last hour from all environmental stations of a specified sensor.
The process flow is very simple:
When the server starts, it listens for messages from the Google Cloud Pub/Sub service (line 65), accessing the subscription created before (line 32). When a message arrives from the platform, the server pulls it (line 63), first sends it to the MongoDB database (lines 40-45), then shows it in the dashboard through the WebSocket connection (lines 47-57). Finally, it sends an ack to the Cloud (line 59). This whole process works asynchronously.
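The store / broadcast / ack sequence can be sketched as a single handler. Here `storeInMongo` and `broadcast` are stand-ins for the real MongoDB insert and WebSocket send, and `message` mimics the shape of a Pub/Sub client message (a `data` buffer plus an `ack()` method):

```javascript
// Server-side flow for one incoming message:
// 1) persist it, 2) push it to the dashboards, 3) acknowledge it.
async function handleMessage(message, storeInMongo, broadcast) {
  const payload = message.data.toString(); // e.g. "temperature;23;1620000000"
  await storeInMongo(payload);             // 1) save the value in MongoDB
  broadcast(payload);                      // 2) send it over the WebSocket connections
  message.ack();                           // 3) ack so Pub/Sub stops redelivering it
}
```

Acking only after the store succeeds means a crashed insert leads to redelivery rather than data loss, which matches the at-least-once choice on the sensor side.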
For more details on how to pull the messages from the GCP, I based my code on the following Google Guide:
This process, therefore, covers the display of the latest values received. For the values received during the last hour, I had to change the code a bit and add management of the arrival times.
When a web page is accessed (or refreshed), the server sends a query to the database to retrieve only values whose arrival date is no more than one hour old. As you can see in the code (line 98), we use the Date.now() function, which returns the number of milliseconds elapsed since January 1, 1970, divided by 1000 to convert it to seconds. This makes the functionality easy to manage, because we can simply subtract 3600 seconds to get the last hour. In this way, even if the sensors stop sending messages, the list of last-hour values is always up to date.
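The cutoff computation described above can be sketched like this (the record shape and the MongoDB query in the comment are assumptions based on the payload fields, not the project's exact schema):

```javascript
// The last-hour cutoff: current time in seconds minus 3600.
function lastHourCutoff(nowMs = Date.now()) {
  return Math.floor(nowMs / 1000) - 3600;
}

// Given records whose `date` field is a Unix timestamp in seconds,
// keep only those from the last hour. In MongoDB the equivalent query
// would be roughly: { date: { $gt: lastHourCutoff() } }
function lastHourValues(records, nowMs = Date.now()) {
  const cutoff = lastHourCutoff(nowMs);
  return records.filter((r) => r.date > cutoff);
}
```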
On the front end, the most relevant part is how to handle the messages arriving from the WebSocket and how to display them:
The first line sends the connection request. When the connection receives a message (payload), the system simply updates the container of the latest value and prepends the new value to the container of the last-hour values.
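That update logic can be sketched with the DOM reduced to a plain state object, so the "replace latest, prepend to last hour" behaviour is visible (the URL and state shape are illustrative, not taken from the project code):

```javascript
// In the browser, the first line would open the connection, e.g.:
// const socket = new WebSocket('wss://my-dashboard.example.com');

// Each incoming payload replaces the "latest value" container and is
// prepended to the "last hour" container; here both are plain fields.
function onMessage(state, payload) {
  return {
    latest: payload,                        // update the latest-value container
    lastHour: [payload, ...state.lastHour], // prepend to the last-hour list
  };
}
```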
The web dashboard is deployed on Heroku at this link:
The code for the dashboard is available in the GitHub repository (see the end of the page) at the path: dashboard/
MongoDB Database
The storage of data is managed by MongoDB. The schema is the following:
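The schema itself is not reproduced here, but based on the payload fields (deviceId, value, date), a stored reading plausibly looks like the document below. The field names are assumptions for illustration, not the project's actual schema:

```javascript
// A plausible stored reading, inferred from the payload format
// deviceId ; value ; date (field names are assumptions).
const sampleReading = {
  deviceId: 'temperature', // which virtual sensor sent the value
  value: 23,               // the measured (random) value
  date: 1620000000,        // arrival time in Unix seconds, used by the last-hour query
};

// A minimal shape check for such documents.
function isValidReading(doc) {
  return typeof doc.deviceId === 'string'
    && typeof doc.value === 'number'
    && typeof doc.date === 'number';
}
```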
To see the system in action, the procedure is very simple:
- Open the web dashboard
- Open the scripts in four different terminals, install the packages with the npm install command, and finally launch the scripts with node
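The steps above can be sketched as the per-terminal commands below. The devices/&lt;sensor&gt; layout follows the devices/temperature/temperature.js path mentioned earlier; the folder names for the other three sensors are assumptions:

```shell
# Prints the commands to run one sensor (it does not execute them,
# since each sensor belongs in its own terminal).
run_cmd() {
  printf 'cd devices/%s && npm install && node %s.js\n' "$1" "$1"
}

# One terminal per virtual sensor:
for sensor in temperature humidity wind rain; do
  run_cmd "$sensor"
done
```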