The purpose of this project was to get a SigFox message from a Pycom SiPy to the SigFox backend, and then from Azure IoT Hub back to a Python script on my computer, making it possible to use as part of a bigger project.
First, I assume that you have access to the SigFox backend and the Azure platform.
Using SigFox

On the SigFox backend, click on "DEVICE TYPE" at the top and then "CALLBACKS" in the side menu. Then click "New" near the top right of the screen.
Here you have some different options. I will choose "Microsoft Azure™ IoT Hub" for this project. Click on that.
Now we need information from the Azure platform. Go there, click "New" at the left of the dashboard, and create an IoT Hub. If you need assistance with this part, I suggest Microsoft's own documentation.
Connection String

To find the connection string, go to your newly created IoT Hub, click on "Shared Access Policies", choose "iothubowner", and copy the "Connection string - primary key". Use it on the SigFox IoT Hub configuration page. If you need help with this, click the small "?" next to the "Connection String" field on that page; it has a nice illustrated step-by-step guide.
Messaging

The next step is to define the payload sent with each message from SigFox to the IoT Hub. This is done in the JSON body field. I ended up with this:
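The exact JSON body depends on which fields you want forwarded. As a sketch, a minimal body using SigFox's built-in callback variables (the names in curly braces are filled in by the SigFox backend at send time) might look like this:

```json
{
  "device": "{device}",
  "data": "{data}",
  "time": "{time}",
  "snr": "{snr}",
  "station": "{station}",
  "rssi": "{rssi}"
}
```

Pick only the variables you actually need; fewer fields means a smaller message on the Azure side.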
Try sending a message with your SigFox device and see if it ends up at the IoT Hub. You should end up with something like this:
Now we are done with the SigFox part. On to the Azure dashboard.
Azure Dashboard

We need three things running on the Azure portal: an IoT Hub, a Stream Analytics Job, and a Service Bus. The way to look at it is: the IoT Hub receives the messages, the Stream Analytics Job redirects them to where they need to go, and finally the Service Bus is an endpoint available to the outside world.
We already have the IoT Hub, so go ahead and create the other two. The process is the same as creating the IoT Hub.
Now click on your newly created Stream Analytics Job. The way this works is that you have an input, you do something with it (Query), and then send it to an output.
Input and Output

Click on "Inputs" and then "Add". Fill out the required information. Under "Source" you can use "Event Hub", and under "Import option" you should be able to choose "Use event hub from current subscription". "Event serialization format" is JSON.
Now click on "Outputs" and "Add". Fill out the required information. It should be possible to choose "Use ... from current subscription" whenever needed. Under "Sink" choose "Service Bus Topic". After adding it, click on the new Output. Under "Import Option" choose "Use topic from current subscription".
Now we need to combine the two. This is done in the Query part. Click on this.
Your Inputs and Outputs should show up on the left-hand side. Queries are written in an SQL-like language, so if you know SQL this should be a piece of cake. You can also test that the connections work; I would suggest doing this, as it took me a couple of tries to get the Inputs and Outputs configured completely correctly. The example query seen below simply takes all information from the input and passes it on to the output.
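As a sketch, assuming your input alias is named sigfox-input and your output alias servicebus-output (use whatever names you gave yours), the pass-through query looks like this:

```sql
SELECT
    *
INTO
    [servicebus-output]
FROM
    [sigfox-input]
```

The square brackets refer to the alias names you chose when creating the Input and Output, not the resource names themselves.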
When done it should look something like this:
If everything works and your testing was successful, remember to start your Stream Analytics Job by pressing the "Start" button.
Service Bus

The next part is using the Service Bus and connecting to it from a Python script. Click on your Service Bus and then on "Shared Access policies". These are needed for the Python script.
But first you need a package to get the information from the Azure Service Bus into Python. Luckily, Microsoft provides such a package.
Next up, start to write the script.
Script

First, import some libraries.
from azure.servicebus import ServiceBusService, Message, Queue
import time
import json
import re
Next, configure your bus_service. This is where you need the information from your Service Bus "Shared Access policies".
bus_service = ServiceBusService(
    service_namespace='thenamespaceofyourservice',
    shared_access_key_name='youraccesskeyname',
    shared_access_key_value='youraccesskey')
queue_options = Queue()
queue_options.max_size_in_megabytes = '5120'
queue_options.default_message_time_to_live = 'PT1M'
To get a message from the Service Bus, use the following (the first argument is the topic name, the second the subscription name):
msg = bus_service.receive_subscription_message('nameoftopic', 'nameofsubscription')
To see the message:
print(msg.body)
If you do this, you will probably get a message that starts with something like this, followed by something that looks like a dict or JSON:
b'@\x06string\x083http://schemas.microsoft.com/2003/10/Serialization/\x9a\xc9\x01
Getting the Message Out

Figuring out how to get the information out of this message was probably what took me the longest.
I have written out the whole process below, and tried to comment it along the way.
# Changed encoding="ascii" to "utf-8"
def decode(s, encoding="utf-8", errors="ignore"):
    return s.decode(encoding=encoding, errors=errors)
# Decode the received message. The encoding used is utf-8
data_decode = decode(msg.body)
# Make a function that returns the part of a string before a given character
def before(value, a):
    pos_a = value.find(a)
    if pos_a == -1:
        return ""
    return value[1:pos_a]
# Remove the string before { from the data, as well as the first char (@).
# re.escape makes sure any special characters in the prefix are treated literally
data_cutstring = before(data_decode, "{")
data_substring = re.sub(re.escape(data_cutstring), '', data_decode)
data_substring = data_substring[1:]
# Remove the last char from the string and replace all null values with a zero
data_remove_final = data_substring[:-1]
data_remove_final = data_remove_final.replace("null", "0")
data_json = json.loads(data_remove_final)
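If you would rather avoid the regex step, a simpler alternative (my own sketch, not the method used above) is to slice the decoded string from the first { to the last }, since the JSON object is the only braced part of the payload:

```python
import json

def extract_json(raw_bytes):
    # Decode the raw Service Bus payload, ignoring the binary framing bytes
    text = raw_bytes.decode("utf-8", errors="ignore")
    # The JSON object is the only {...} section of the payload,
    # so slice from the first { to the last }
    start = text.find("{")
    end = text.rfind("}")
    if start == -1 or end == -1:
        return None
    # Mirror the approach above: replace null values with a zero
    cleaned = text[start:end + 1].replace("null", "0")
    return json.loads(cleaned)
```

You would then call it directly on the raw message body, e.g. data_json = extract_json(msg.body).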
If you do a print of data_json, you should get a nice-looking dict.