This tutorial is intended as the starting point of a project in which I aim to build an Augmented Reality & IoT app.
To get there, we first need to publish data to an MQTT broker (Adafruit IO, io.adafruit.com).
I've chosen Adafruit IO because it's free, it accepts different sorts of data (numbers, strings, images), and it has a dashboard feature that lets you visualize the published data in different ways.
What You'll Need

You'll need the following:
- A MicroPython powered device (in this case I'm using an Arduino Portenta)
- OpenMV IDE (https://openmv.io/pages/download)
- An account at Adafruit IO (io.adafruit.com)
- Optional: Camera (in this case I'm using the Portenta Vision Shield)
Arduino Portenta allows you to run the code in different ways:
- With the Arduino IDE, where once you have uploaded the code the board only needs a power supply to keep running
- With the OpenMV IDE, where the code runs only while the board is plugged in and the script is run from the IDE
- With MicroPython, where you upload a main.py file to the Arduino board. Once the file has been uploaded, the board starts running your Python code as soon as it has a power supply (similar to the Arduino IDE)
This tutorial shows how to run the code from the OpenMV IDE or with MicroPython by uploading the main.py file to the board.
Setting up Adafruit IO

Go to io.adafruit.com to register your account, then log in.
Now click on the Feeds tab and create a New Feed. You can name it as you like; in my case I created four feeds called image, ledstatus, temperature and text, because those are the different data formats we are going to send later on.
Note: You can skip this step and create the feeds directly from main.py. If you do so, and you do not create the feeds on Adafruit IO, the program will create them in Adafruit IO by itself. This has pros and cons: the main pro is that you save some time, but on the other hand it can cause issues because Adafruit IO assigns default settings to the feeds it creates. The image feed in particular is worth creating manually on Adafruit IO: images can be sent to the MQTT broker in two sizes (1 KB or 100 KB), and if we do not create the image feed ourselves, the 1 KB size is assigned by default. This could crash our code if we send 100 KB images.
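The 1 KB limit mentioned above can also be guarded against in code. Below is a minimal sketch (the function name and limit constant are my own, not part of the tutorial's script) that base64-encodes an image buffer and refuses payloads over the feed limit; on the board you would use ubinascii, which exposes the same b2a_base64:

```python
import binascii  # MicroPython's ubinascii offers the same b2a_base64

MAX_PAYLOAD = 1024  # assumed default Adafruit IO feed limit of 1 KB, per the note above

def encode_for_feed(jpeg_bytes, limit=MAX_PAYLOAD):
    """Base64-encode an image buffer and reject payloads over the feed limit."""
    payload = binascii.b2a_base64(jpeg_bytes).decode("utf-8").strip()
    if len(payload) > limit:
        raise ValueError("payload of %d chars exceeds feed limit" % len(payload))
    return payload

# Base64 inflates data by 4/3: 600 raw bytes become 800 characters, still under 1 KB
print(len(encode_for_feed(b"\xff" * 600)))  # → 800
```

Note that base64 encoding grows the data by roughly a third, so the size that matters for the limit is the encoded string, not the raw JPEG.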
Once you've created the feeds, you'll be able to create the project dashboard. Keep in mind that if you haven't created any feed, you won't be able to set up the dashboard.
To create a Dashboard, click on Dashboard and then on New Dashboard. You can choose from a wide range of blocks depending on your specific needs (e.g. for numbers you could choose between a chart, a slider, a gauge, or simply displaying the numbers in a text box).
Important: Do not forget to switch Show history to Live in order to plot the data in real time.
Once you've finished, your Dashboard might look like the one below:
Finally, it's time to get the key that will be required to upload data from the code to Adafruit IO. Click on My Key and copy the Active Key, which is a string of letters and symbols.
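The script later in this tutorial hard-codes the key in main.py for simplicity. A common MicroPython pattern worth considering (my suggestion, not part of the original script) is keeping credentials in a separate secrets.py on the board, so main.py can be shared without leaking them:

```python
# secrets.py -- keep this file off version control and off screenshots
SSID = "your-wifi-network-name"
WIFI_KEY = "your-wifi-network-password"
AIO_USER = "your-adafruit-io-user"
AIO_KEY = "your-adafruit-io-key"
```

main.py would then start with `from secrets import SSID, WIFI_KEY, AIO_USER, AIO_KEY` and use those names instead of the literal strings.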
Download the OpenMV IDE from https://openmv.io/pages/download. Once it has downloaded, install the software and run it.
I'm using OpenMV because it helps me take snapshots with the camera and also comes with an MQTT library that will be required to publish data to Adafruit IO.
Plug in the Arduino Portenta and update the board firmware if required.
Before writing any code, we'll run a quick test to be sure that the OpenMV IDE can upload sketches to the Arduino.
In this case the test code blinks the Arduino Portenta's built-in blue LED.
import time, pyb

led = pyb.LED(3)  # 1 is the red, 2 the green and 3 the blue built-in LED
while True:
    led.on()
    time.sleep(1)
    led.off()
    time.sleep(2)
If the test is successful (blue LED ON for 1 second, then OFF for 2 seconds), we can move on to the last step.
Main.py

This code will:
- Connect to your Wi-Fi network (the green LED turning ON indicates that the connection was successful)
- Optional: Initialize the Vision Shield camera. If you do not have a camera, skip the lines that start with sensor
- Connect to Adafruit IO (the MQTT broker)
- Publish data to Adafruit IO at a set interval (each time the blue LED turns ON, data has been published to Adafruit IO)
Note that Adafruit IO sets a maximum publish rate per minute, so keep this in mind to avoid throttling.
The rate depends on your plan: 30 data points per minute on the free plan and 60 per minute on the IO+ plan.
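Since the loop below publishes to four feeds per iteration, the free plan's 30 points per minute allows at most 7 iterations per minute, i.e. a sleep of roughly 8 seconds or more between iterations. A quick sketch of that arithmetic (the 10-second sleep used in the code sits safely above the minimum):

```python
import math

FEEDS_PER_CYCLE = 4  # temperature, text, image, ledstatus
FREE_PLAN_RATE = 30  # data points per minute on the free plan

# Minimum seconds between loop iterations to stay under the rate limit
min_interval = math.ceil(60 * FEEDS_PER_CYCLE / FREE_PLAN_RATE)
print(min_interval)  # → 8
```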
import time, network, pyb
from mqtt import MQTTClient
import sensor, image, ubinascii

start_led = pyb.LED(2)    # green LED: a 2-second blink indicates that the Wi-Fi connection was successful
working_led = pyb.LED(3)  # blue LED: a short blink indicates that data has been sent to the MQTT broker

sensor.reset()                          # Reset and initialize the sensor
sensor.set_pixformat(sensor.GRAYSCALE)  # The Arduino Vision Shield is monochromatic
sensor.set_framesize(sensor.QQVGA)      # QQVGA reduces the size of the image to be sent
sensor.skip_frames(time=2000)           # Wait for settings to take effect

SSID = 'your-wifi-network-name'
KEY = 'your-wifi-network-password'

print("Trying to connect. Note this may take a while...")  # Init wlan module and connect to the network
wlan = network.WLAN(network.STA_IF)
wlan.deinit()
wlan.active(True)
wlan.connect(SSID, KEY, timeout=30000)
print("WiFi Connected ", wlan.ifconfig())  # We should have a valid IP now via DHCP

start_led.on()
time.sleep(2)  # Green LED ON for 2 seconds indicates that the connection was successful
start_led.off()

client = MQTTClient("openmv", "io.adafruit.com", user="your-adafruit-io-user", password="your-adafruit-io-key", port=1883)
client.connect()

i = 0
while True:
    i_t = str(i)  # data to be published has to be in string format
    text = "Running for " + i_t + " sec"  # e.g. "Running for 450 sec"
    if i % 2 == 0:
        ledstatus = "0"
    else:
        ledstatus = "1"
    img = sensor.snapshot()
    img = img.compress(quality=90)  # reduce the image size to keep it below 1 KB
    img_bytes = ubinascii.b2a_base64(img)  # encode the image as base64 bytes
    img_t = img_bytes.decode("utf-8")  # convert the bytes to a string
    img_t = img_t.strip()  # remove the trailing newline added by b2a_base64
    working_led.on()
    client.publish("adafruit-io-user/feeds/temperature", i_t)
    client.publish("adafruit-io-user/feeds/text", text)
    client.publish("adafruit-io-user/feeds/image", img_t)
    client.publish("adafruit-io-user/feeds/ledstatus", ledstatus)
    working_led.off()
    time.sleep(10)
    i += 1
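One detail of the image pipeline worth knowing: b2a_base64 appends a trailing newline to its output, and that newline is what the strip() call in the loop removes. This is easy to verify on desktop Python, where the standard binascii module mirrors MicroPython's ubinascii:

```python
import binascii

encoded = binascii.b2a_base64(b"snapshot")  # bytes, ending in b"\n"
text = encoded.decode("utf-8")

print(repr(text))          # → 'c25hcHNob3Q=\n'
print(repr(text.strip()))  # → 'c25hcHNob3Q='
```

Stripping it matters because Adafruit IO expects the image feed value to be a clean base64 string.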
Before running the code, you need to rewrite the following lines with your own Wi-Fi and Adafruit IO details:
- SSID = 'your-wifi-network-name'
- KEY = 'your-wifi-network-password'
- client = MQTTClient("openmv", "io.adafruit.com", user="your-adafruit-io-user", password="your-adafruit-io-key", port=1883)
- client.publish("adafruit-io-user/feeds/temperature", i_t)
- client.publish("adafruit-io-user/feeds/text", text)
- client.publish("adafruit-io-user/feeds/image", img_t)
- client.publish("adafruit-io-user/feeds/ledstatus", ledstatus)
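Adafruit IO MQTT topics follow the pattern username/feeds/feed-key (note "feeds", plural). A small helper (my own suggestion, not part of the original script) builds the topics in one place, so your username and the "feeds" segment cannot drift out of sync across the four publish calls:

```python
ADAFRUIT_IO_USER = "your-adafruit-io-user"  # replace with your username

def feed_topic(feed_key, user=ADAFRUIT_IO_USER):
    """Build an Adafruit IO MQTT topic: '<user>/feeds/<feed-key>'."""
    return "%s/feeds/%s" % (user, feed_key)

print(feed_topic("temperature"))  # → your-adafruit-io-user/feeds/temperature
```

In the main loop you would then write, for example, `client.publish(feed_topic("temperature"), i_t)`.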
Once you have made those last small modifications, your main.py file is ready to be uploaded to the Arduino Portenta to start publishing data to Adafruit IO.
Every time data is published to Adafruit IO you should be able to see it on the dashboard.
In the next tutorial I'll show how to take data from Adafruit IO and display it in Unity.