Software apps and online services
In Uttarakhand, Nepal and most recently in Kerala, floods have caused tremendous devastation, with great damage to life and property. The heaps of landmass spread over large areas after the floods made it difficult to carry out search-and-rescue operations or to navigate manually, and many lives were lost because emergency support could not reach them. In another recent incident in Thailand, 13 people were trapped in a cave. With natural and man-made calamities on the rise, we need to re-examine our ability to perform emergency response operations without delay.
With the help of the latest technology, it is now possible to reduce the delay in responding to an emergency scenario such as a disaster, calamity or accident, and to drastically reduce the workforce required for highly sophisticated tasks. Using a connected autonomous system consisting of an aerial vehicle (drone) and a ground vehicle (Donkey Car), a scene can be monitored in real time and instant support can be provided.
Keywords: Connected autonomous systems, drone, donkey car, emergency response
- Autonomous aerial and ground operation
- Aerial and terrain mapping
- Image segmentation and analysis for safe zones
- Path planning and autonomous navigation
- Obstacle detection and avoidance
- Emergency assist and Payload transportation
- Connected systems with multiple units
- Automatic Solar charging
- Live monitoring and control
- Mobile application support
-----------------------------------------------------------------------------------------------------------------
Workflow and Explanation:
As the project unfolded, we ran into complexity, design, time and resource constraints, which caused some variations from the originally proposed idea. However, this project will remain live until the requirements are met, or longer if additional requirements are discovered.
This project guide explains all the basic steps and procedures performed towards attaining the final goals of the project. I won't be writing in highly technical terms; I will try to keep this as simple as possible so that anyone referring to this guide can infer and understand the concepts. Please also keep in mind that, for readability, I would like to keep this guide as compact as possible. So, wherever applicable, I will refer to resources on other websites where detailed information can be obtained instead of repeating everything here. The most important points, however, will be mentioned here.
This project is divided into two main sections:
1. Aerial drone (Eye In The Sky)
2. Ground vehicle (Ground Scout)
First of all, let me introduce our eye in the sky. We call this machine... the ASPIRE.
What ASPIRE basically does is follow certain way-points, scan the area and, if it finds anything alarming, report back to the home base. In our case, we provide the way-point information. It traverses each of these way-points automatically and, with the help of the attached camera, checks whether any humans are present in the area. If it happens to find someone, their location is reported back to the base.
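To give a rough idea of the way-point loop, the arrival check could be sketched as below. The function names and the 2 m arrival tolerance are my illustrative assumptions for this guide, not the actual FlytOS mission code:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def waypoint_reached(current, target, tolerance_m=2.0):
    """True once the drone is within tolerance_m of the target way-point."""
    return haversine_m(current[0], current[1], target[0], target[1]) <= tolerance_m
```

In the real mission the autopilot handles arrival internally; a check like this is only needed if you drive the way-points yourself from the companion computer.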
Hardware implementation of ASPIRE:
ASPIRE can be any multi-rotor equipped with a Raspberry Pi 3 companion computer running Ubuntu MATE, in addition to the Pixhawk 4 flight controller. We have used an S500 frame to build ASPIRE. The BLDC motors are 920 KV, each carrying a 9045 propeller. Details of building a multi-rotor are available online, so I won't repeat them here. How to connect and configure the Raspberry Pi and the Pixhawk is explained in detail here. For creating automatic way-point missions, a variety of software tools such as QGroundControl and Mission Planner are available. DroneKit was used for the simulations done to finalise the mission capabilities of the drone; there is a great set of tutorials by Tiziano Fiorenzani on setting up and using DroneKit in various drone applications. We used FlytOS and its APIs to define and execute way-point missions on the real hardware, as FlytOS is based on ROS. The Raspberry Pi manages and executes these tasks, detects humans and reports their location back to the home base. A Logitech C270 HD camera is used for image capture; an RPi camera can also be used. A u-blox NEO-M8N GPS module (link) with compass is used for localisation and navigation.
Software implementation of ASPIRE:
The software implementation of ASPIRE is purely based on Python, ROS and FlytOS. FlytOS is a software framework and platform that can be used to develop custom APIs for controlling a variety of drones. The backbone of this framework is the ROS, MAVROS and MAVLink modules. Using the FlytOS APIs, we can call functions that carry out specific tasks such as drone takeoff, landing, position control, way-point execution and so on.
from flyt_python import api
import cv2
import time

#-- Setup people detection
person_cascade = cv2.CascadeClassifier('haarcascade_fullbody.xml')
cap = cv2.VideoCapture(0)

#-- People detection function using HAAR
def detect_people():
    detected = False
    r, frame = cap.read()
    frame = cv2.resize(frame, (640, 360))  # Downscaling for speed
    gray_frame = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)  # OpenCV frames are BGR
    rects = person_cascade.detectMultiScale(gray_frame)
    for (x, y, w, h) in rects:
        detected = True
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    if detected:
        print('human(s) detected')
    return detected

#-- instance of flyt drone navigation class
drone = api.navigation(timeout=120000)
time.sleep(3)  # at least 3 sec sleep time for the drone interface to initialize properly
print('Drone ready')

print('taking off')
drone.take_off(5.0)

print('Executing pre-defined setpoints')
drone.position_set(5, 0, 0, relative=True)
detect_people()
drone.position_set(0, 5, 0, relative=True)
detect_people()
drone.position_set(-5, 0, 0, relative=True)
detect_people()
drone.position_set(0, -5, 0, relative=True)
detect_people()

drone.land()
#-- shutdown the instance
drone.disconnect()
What this code does is create an instance of the drone navigation class defined in flyt_python and carry out predefined tasks such as takeoff and the square survey pattern in a sequential manner. Alongside, it checks for people after each setpoint and, when someone is identified, prints an alert on the ground control terminal screen.
Now, coming to the ground scout: ARORA is a miniature terrain vehicle capable of traversing either manually or autonomously. This vehicle can be any typical RC ground vehicle, preferably with a brushed DC motor controlled by an ESC. The best example of this type is a Donkey Car. The advantage of the Donkey Car is that it comes with autonomous capability via a Raspberry Pi 3, achieved by training the car. For setting up the car, training it and producing a trained model, please refer to the exclusive official guide here.
Hardware implementation of ARORA:
The PCA9685 PWM driver that comes with the Donkey Car kit is capable enough to drive all the servos and motors supplied with it. But if you plan to attach more sensors or actuators, I would suggest using a Raspberry Pi shield like the one we have used, available here.
A u-blox NEO-6M GPS (link) is attached, which gives the local position of the car. Interfacing the GPS with the Raspberry Pi 3 is covered in this documentation. GPS- and compass-based autonomous navigation will be integrated later.
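The NEO-6M streams standard NMEA sentences over the Pi's serial port. To show what reading the position involves, here is a minimal parser for the $GPGGA fix sentence; this is an illustrative sketch, not the exact code running on ARORA:

```python
def parse_gpgga(sentence):
    """Extract (lat, lon) in decimal degrees from an NMEA $GPGGA sentence.

    Returns None if the sentence carries no fix.
    """
    fields = sentence.split(',')
    if not fields[0].endswith('GGA') or fields[2] == '':
        return None
    # Latitude arrives as ddmm.mmmm, longitude as dddmm.mmmm
    lat = float(fields[2][:2]) + float(fields[2][2:]) / 60.0
    if fields[3] == 'S':
        lat = -lat
    lon = float(fields[4][:3]) + float(fields[4][3:]) / 60.0
    if fields[5] == 'W':
        lon = -lon
    return lat, lon
```

In practice a library such as pynmea2 does this parsing for you; the point here is only the ddmm.mmmm-to-decimal conversion, which trips up many first-time GPS users.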
We attached a few more sensors: an HD camera for image processing, and IR and SONAR for obstacle avoidance. Driving multiple SONAR modules is more tedious than driving IR, so IR is recommended in the short run.
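The obstacle-avoidance decision made from the IR readings can be sketched as a simple priority rule. The three-sensor layout and the command names here are assumptions for illustration, not the exact logic on the car:

```python
def steer_from_ir(left, centre, right):
    """Pick a steering command from three boolean IR obstacle flags.

    True means that sensor sees an obstacle. Returns one of
    'forward', 'right', 'left' or 'reverse'.
    """
    if not centre:
        return 'forward'   # path ahead is clear, keep going
    if not right:
        return 'right'     # dodge around the obstacle on the clear side
    if not left:
        return 'left'
    return 'reverse'       # boxed in on all three sides: back out
```

A real implementation would debounce the IR inputs over a few readings, since these sensors are noisy near reflective surfaces.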
Separate power supply units must be provided for the Raspberry Pi and the motor assembly. This avoids motor noise getting into the Pi's supply, which may cause instability. Both supplies should still share a common ground.
The ground vehicle is also equipped with optional solar panel cells and an automatic solar battery charging circuit for seamless and extended periods of operation, enabling recharging on-the-go.
Software implementation of ARORA:
A Python program was developed to give custom commands to the ground vehicle. This program runs on the on-board Raspberry Pi and controls the vehicle through the motor drivers. It also has provision to control a pan-tilt servo system on which the camera can be mounted.
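At its core, commanding the vehicle means turning normalised throttle and steering values into RC-style pulse widths for the ESC and steering servo. A sketch of that mapping is below; the 1000–2000 µs range with a 1500 µs neutral is the RC convention I am assuming, and the function names are mine:

```python
def to_pulse_us(value, centre=1500, span=500):
    """Map a normalised command in [-1, 1] to an RC pulse width in microseconds.

    1500 us is the conventional neutral point; 1000-2000 us is full range.
    """
    value = max(-1.0, min(1.0, value))  # clamp out-of-range input
    return int(round(centre + span * value))

def drive(throttle, steering):
    """Convert throttle/steering commands to (ESC, servo) pulse widths."""
    return to_pulse_us(throttle), to_pulse_us(steering)
```

On the real car, the PCA9685 driver takes these microsecond values converted into 12-bit duty-cycle counts at its PWM frequency.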
Manual control of ARORA is implemented using a custom-developed mobile application, which also provides a live camera feed from the on-board HD camera. More details of this mobile app can be obtained from here. The app allows both manual and automatic operation of the ground vehicle: manual control is performed via a virtual joystick implemented in the app, while automatic operation uses obstacle avoidance driven by the on-board IR sensors via the Python program.
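One detail worth handling on the joystick side is a deadzone, so the car does not creep when the stick is released. A hedged sketch of the axis shaping we mean (the deadzone size and function names are illustrative, not the app's actual code):

```python
def joystick_to_command(x, y, deadzone=0.1):
    """Map virtual-joystick axes in [-1, 1] to (steering, throttle).

    Values inside the deadzone are treated as zero; outside it, the
    output is rescaled so it still spans the full [-1, 1] range.
    """
    def shape(v):
        if abs(v) < deadzone:
            return 0.0
        sign = 1.0 if v > 0 else -1.0
        return sign * (abs(v) - deadzone) / (1.0 - deadzone)
    return shape(x), shape(y)
```

Without the rescaling step, commands just outside the deadzone would jump abruptly from zero, making low-speed control twitchy.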
This system and the area of operation can be expanded further by adding multiple aerial and ground units. The units will be monitored via a single mission management console at the ground control station.
The remaining requirements, if any, are currently under development and will be updated as progress is made:
- Nvidia Jetson TX2 / Avnet Ultra96 will be used as the companion computer, which will also provide faster AI processing and reliable stereo processing capability.
- Inclusion of thermal camera for analysis of unknown terrain and structures.
- Amphibious upgrade: upgrading the Donkey Car to operate on both land and water, to increase mission effectiveness where water search and rescue is needed, as in the Thailand cave incident.
- Donkey car will be fitted with detachable screw barrel tires so as to enable its easy movement on sand, water and snow.
- Weatherproofing of electronics using waterproof counterparts (ESC, Motors etc.), plastic encasing, water repellent paste and hot glued contacts.
- Setting up a distributed system of drones and cars covering a wide area of operation. A large area will be divided into sectors (depending on endurance of operation), and each sector will have a control centre equipped with a drone and a car. If required, backup drones and cars can be pulled from nearby sectors.
- A mobile application with SOS functionality which immediately sends the location of the user to the centralised monitoring system, alerting the team regarding an emergency at the user location. The drone will be immediately deployed for monitoring the location and the car will be kept in standby for response.
- Setting up emergency battery replacing stations for drones and cars so as to increase the endurance of operation (concept is already under our development).
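Since the sector-based dispatch above is still a concept, here is only a toy sketch of the selection rule we have in mind: pick the closest sector that still has a free unit. Every name and the data layout here are hypothetical:

```python
def nearest_backup(sos_location, sectors):
    """Pick the closest sector that still has a free drone.

    sectors maps a sector name to ((lat, lon), drones_available).
    Distances use a flat-earth approximation, acceptable at sector scale.
    """
    def dist2(a, b):
        return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2

    candidates = [(name, pos) for name, (pos, free) in sectors.items() if free > 0]
    if not candidates:
        return None  # no backup available anywhere
    return min(candidates, key=lambda c: dist2(c[1], sos_location))[0]
```

A production dispatcher would also weigh remaining battery endurance and travel time, not just straight-line distance.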
****Since you have read this far, we hope this project has caught your attention. No project venture succeeds without valuable feedback and criticism, so we welcome both from our fellow readers.****