Handing someone a drone remote control for the first time is like giving a small child the detonator of a bomb: at any moment the drone can go out of control and poke someone's eye out.
Some examples of problems caused by bad drone handling:
"During a Christian Democratic Party campaign in September 2014, a Parrot AR drone crashed in front of German Chancellor Angela Merkel."
"What started out as a goofy holiday promotion ended terribly when a drone crashed into the face of Brooklyn Daily photographer Georgine Benvenuto"
https://www.techrepublic.com/article/12-drone-disasters-that-show-why-the-faa-hates-drones/
Personally, the first time I flew a drone I almost cut off one of my friend's fingers, so I'm speaking from experience.
The idea is to build a control system for the drone driven by gestures made with the SmartEdge Agile, similar to the one you get in high-end drones but better, and usable with any drone (we are using a very cheap and small Tello drone for this). Also, it will not use computer vision, but haptic movement manipulation!
Table of contents:
- Introduction
- System Diagram
- Brainium Config
- Raspberry Pi Configuration
- Laptop Configuration
- Demo
Introduction
I'm going to create a system that can perform full control of a drone using gestures made with the SmartEdge Agile, instead of using a radio control system or an application on a cell phone.
The current approaches to controlling a drone are:
Radio control system: These systems are efficient for professional drone pilots, but when an inexperienced person just wants to have some fun, it is very likely that the drone will end up on the pavement with a broken propeller.
https://www.dummies.com/consumer-electronics/drones/how-to-fly-your-drone-with-an-rc-transmitter/
Application: These systems are common in drones controlled over Wi-Fi, such as the Tello; however, the apps are not very intuitive and in general just make handling the drone clumsier.
https://play.google.com/store/apps/details?id=com.ryzerobotics.tello&hl=en_US
Gestures captured by camera: Although some drones already implement these systems, they are only capable of simple actions and cannot learn new commands.
https://www.heliguy.com/blog/dji-spark-gesture-control-tutorial/
Autonomous control systems by GPS: Despite being the most expensive and precise option, this type of system takes away everything that is interesting for a drone pilot, since the drone literally flies itself.
https://www.droneomega.com/gps-drone-navigation-works/
My solution provides a more intuitive and fun way to control the drone, including flips and other tricks if the user wishes, plus the ability to add new gestures to the system to make it even more fun and interesting.
System Diagram
The following connection schematic represents the system's architecture.
Brainium Config
Enter the Motion Recognition tab and create a new project.
In the project hub, create the movements that will be used to control the drone.
Table of Movements:
For each model you have to record the movements in separate training sessions (at least two); in my case, 30 repetitions of the same movement. To generate the model, you need to record all the sessions first.
For each model, check the table to see which movements need more repetitions so that they are recognized more easily.
To configure the MQTT-based communication from Brainium, we must create a widget that sends the last recognized pattern.
- Create a new widget.
- Configure it as shown in the image.
- Select the device that will record the data.
- In the widget we can see the last recorded movement. This information is sent via MQTT as JSON; the code for the Raspberry Pi is already configured to filter out everything except the registered patterns (a rough sketch of that filtering follows below).
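To give an idea of what that filtering looks like, here is a minimal sketch of how an incoming message could be parsed. The exact payload schema Brainium sends is not reproduced here, so the "pattern" field and the gesture names are placeholders; compare against the actual messages your widget produces (the real logic lives in "RaspCode.py").

import json

# Example gesture names -- replace them with the patterns you trained in Brainium.
KNOWN_PATTERNS = {"takeoff", "land", "forward", "backward", "flip"}

def extract_pattern(payload):
    # Return the recognized movement from a Brainium MQTT message,
    # or None if it is not one of our registered patterns.
    # "pattern" is a placeholder field name -- adjust it to the real schema.
    try:
        data = json.loads(payload.decode("utf-8"))
    except ValueError:  # also covers UnicodeDecodeError and bad JSON
        return None
    pattern = data.get("pattern")
    return pattern if pattern in KNOWN_PATTERNS else None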
Raspberry Pi Configuration
If you are new to the Raspberry Pi world, consider setting up your board with the following tutorial:
https://projects.raspberrypi.org/en/projects/raspberry-pi-setting-up
You need the following configuration to be able to perform serial communication.
Once the Raspberry Pi is configured, we have to obtain the credentials for MQTT connectivity via Brainium. These credentials can be obtained at the following link:
https://spa.brainium.com/profile
Obtain these credentials:
- External access token
- User ID
Go into the Devices tab and obtain your device ID. (If you have not renamed your module, you will see the device ID where mine shows "Dedsec".)
If you have already renamed it and did not note down the device ID, you can find it by pressing the "+" button; it appears below the name.
Input the credentials inside the "RaspCode.py" code.
mqtt_password = 'YOURTOKEN' # copy and paste the external access token from your account here
user_id = 'YOURUSERID' # copy and paste your user ID here
device_id = 'YOURDEVICEID' # copy and paste your device ID here
With this, the Raspberry Pi receives every pattern detected by the AI module and treats it as a command.
IMPORTANT: To connect the USB-serial module to the Raspberry Pi, follow the diagram.
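To make the flow concrete, here is a minimal sketch of the Raspberry Pi side: subscribe to Brainium over MQTT with the credentials above and forward each recognized pattern over the USB-serial link. The broker host, topic, JSON field name, and serial port are assumptions for illustration only; take the real values from "RaspCode.py" or the Brainium documentation. Note that it needs the pyserial package in addition to paho-mqtt.

import json
import serial                    # pyserial
import paho.mqtt.client as mqtt

mqtt_password = 'YOURTOKEN'      # external access token
user_id = 'YOURUSERID'
device_id = 'YOURDEVICEID'

# Hypothetical broker host and topic -- use the real values from RaspCode.py
# or the Brainium documentation.
BROKER_HOST = 'mqtt.brainium.com'
TOPIC = 'users/' + user_id + '/devices/' + device_id + '/patterns'

ser = serial.Serial('/dev/ttyUSB0', 9600, timeout=1)  # USB-serial module on the Pi

def on_connect(client, userdata, flags, rc):
    client.subscribe(TOPIC)

def on_message(client, userdata, msg):
    try:
        pattern = json.loads(msg.payload.decode('utf-8')).get('pattern')
    except ValueError:
        return
    if pattern:
        # Forward one command per line over the serial link to the laptop.
        ser.write((pattern + '\n').encode('utf-8'))

client = mqtt.Client()
client.username_pw_set(user_id, mqtt_password)
client.on_connect = on_connect
client.on_message = on_message
client.connect(BROKER_HOST, 1883)
client.loop_forever()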
Laptop Configuration
You need the following dependencies:
- Paho-Mqtt. (https://pypi.org/project/paho-mqtt/)
- TelloPy. (https://pypi.org/project/tellopy/)
- Install Python via Anaconda and install the libraries above. (https://www.anaconda.com/distribution/)
- Connect to the Tello Drone WiFi network.
- Connect the USB-serial module, then download and run the code "TelloSerial.py" (a minimal sketch of what it could look like follows below).
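Here is a minimal sketch of what the laptop side could look like: read gesture names from the serial link and translate them into TelloPy calls. The serial port name and the gesture-to-action mapping are assumptions for illustration; the actual logic is in "TelloSerial.py". The TelloPy calls themselves (takeoff, land, flips, directional moves) are the library's real API, and pyserial is needed here too.

import serial    # pyserial
import tellopy

def handle(drone, command):
    # Translate a gesture name sent by the Raspberry Pi into a TelloPy action.
    # The gesture names are examples -- match them to your Brainium patterns.
    if command == 'takeoff':
        drone.takeoff()
    elif command == 'land':
        drone.land()
    elif command == 'flip':
        drone.flip_forward()
    elif command == 'forward':
        drone.forward(30)        # speed in the 0-100 range
    elif command == 'backward':
        drone.backward(30)

drone = tellopy.Tello()
drone.connect()
drone.wait_for_connection(60.0)  # requires being on the Tello's Wi-Fi network

# 'COM3' on Windows, '/dev/ttyUSB0' on Linux -- adjust to your USB-serial adapter.
ser = serial.Serial('COM3', 9600, timeout=1)

try:
    while True:
        line = ser.readline().decode('utf-8').strip()
        if line:
            handle(drone, line)
except KeyboardInterrupt:
    drone.land()
    drone.quit()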
That's pretty much it in terms of configuration. As you might have guessed, we first detect the gesture at the edge with the SmartEdge Agile device, send it to the Brainium portal to perform the AI pattern recognition, and then pass it via MQTT to the Raspberry Pi. After that, the Raspberry Pi is simply connected via serial to a PC, which runs the Python script and produces the desired movement.
Demo
In this video we present our EPIC demo.
Business model and opportunity
For this we have to take into consideration two aspects of the technology. We are not really developing drones but the haptic technology to control them, so let's take a look at the haptic technology market:
We can see that the opportunity is quite big, with a CAGR of 17% over the next few years. "Major factors driving the growth of haptic technology market is the increasing adoption of haptics technology into consumer electronics like mobile phones and tablets, gaming consoles, automotive sector and many others. Haptic devices have integrated tactile devices that measure the force exerted by users. Haptics provides with an enhanced user interface by taking the user experience to a whole new level." (Market Research Future.) That whole new level, for us, is haptic control for drones.
The global haptic technology market is expected to reach approximately USD 22 billion by 2023, at a CAGR of 16% between 2017 and 2023.
Now that we have seen that the haptic market is in a good spot, let's take a look at the drone market.
The UAV market is estimated at USD 20.71 billion in 2018 and is projected to reach USD 52.30 billion by 2025, at a CAGR of 14.15% from 2018 to 2025, according to MarketsandMarkets. While the growth rate is somewhat lower than that of haptics, UAVs are a much older technology and their "hype" has died down a little. Nevertheless, it is a huge market that will keep on growing, and the combination of both technologies is what sets this project apart: it is a much more immersive way of controlling drones, and the use of AI makes it even better, as the system could learn from users and adapt to conditions.
I would take two approaches here: one would be licensing the technology (after some more development and iterations) to a big drone corporation such as DJI; the other would be developing a kit that can be used with any other drone or UAV.
Future rollout and comments
For the next steps of the project I would like to replace the laptop with another Raspberry Pi or another embedded computer, and perhaps put the two devices in an enclosure. The main issue was that two Wi-Fi interfaces are needed, one for the MQTT connection and one for the drone communications. That could probably be solved with a board with dual Wi-Fi chips, but for the moment this setup was available and worked perfectly.
Apart from that, I actually think haptic control is the way to go for future control systems. It is much more reliable than, say, gestures recognized via computer vision, since those require quite a lot of processing, whether at the edge or in the cloud. The lag from the cloud is currently quite substantial and probably not recommended for controlling a drone; this project does rely on the cloud, but only for certain very precise commands. That is probably one of the caveats of the project, but it can certainly be addressed in the near future with improved technologies such as 5G that reduce the lag. For the moment, haptic technologies are quite an exciting field to explore, with various applications, and I found that Brainium's SmartEdge Agile is an exemplary device for exploring them.
References
https://www.marketresearchfuture.com/reports/haptic-technology-market-4011