These Drones Are Tracked and Controlled Using $5 Webcams and ESP32s

Joshua Bird's open source project allows users to control autonomous drones through a web interface and a series of ESP32 boards.

Overview

When it comes to indoor motion/object tracking, many people are familiar with the style of VR headset that uses "lighthouse" cameras in the corners of a room to triangulate the player's controllers. To achieve this, high-resolution cameras capture points on the controller and send them to a PC for processing. The same technique has also been applied to tracking drone swarms, but those systems rely on dozens or even hundreds of ceiling-mounted cameras to function. As a much cheaper alternative, maker and student Joshua Bird has developed an open source system that uses $5 webcams and ESP32-powered drones to perform indoor waypoint following.

Building the drones

At first, Bird looked into buying an inexpensive off-the-shelf ESP32 drone such as the ESP-Drone, but he soon realized that tuning its flight controller would be too much work. Instead, he purchased a brushed flight controller and paired it with four DC motors and propellers, a battery, an ESP32, and three infrared LED emitters.

Since the flight controller expects UART input, Bird wrote a simple program that receives incoming JSON flight data over the ESP-NOW protocol, performs some adjustments with a PID loop, and then sends the result to the flight controller over a single pin. In this configuration, each drone achieves about ten minutes of flight time and is fairly crash resistant.
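The article doesn't include Bird's firmware, but a minimal MicroPython sketch can illustrate the receive-adjust-forward loop described above. The JSON field names, PID gains, UART pin, and the comma-separated output frame are all illustrative assumptions, not details from the project.

```python
# Hypothetical sketch: receive JSON setpoints over ESP-NOW, run a simple PID
# on altitude, and forward a command frame to the flight controller via UART.
import json
import network
import espnow
from machine import UART

sta = network.WLAN(network.STA_IF)    # ESP-NOW requires the station interface
sta.active(True)

link = espnow.ESPNow()
link.active(True)

fc = UART(1, baudrate=115200, tx=17)  # single TX pin into the flight controller

kp, ki, kd = 1.2, 0.02, 0.4           # illustrative PID gains
integral, prev_err = 0.0, 0.0

while True:
    _, msg = link.recv()              # blocks until a packet arrives
    if not msg:
        continue
    cmd = json.loads(msg)             # e.g. {"alt": 1.2, "measured": 1.05, ...}

    err = cmd["alt"] - cmd["measured"]
    integral += err
    throttle = kp * err + ki * integral + kd * (err - prev_err)
    prev_err = err

    # Hypothetical fixed-width frame the flight controller parses over UART
    frame = "{:.3f},{:.3f},{:.3f}\n".format(throttle, cmd["roll"], cmd["pitch"])
    fc.write(frame.encode())
```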

A camera system

For the cameras, Bird picked out four PlayStation 3 Eye webcams with IR sensitivity, and after swapping their IR-blocking filters for ones that pass only IR light, he mounted them around his dorm room. Each camera was then connected to his host PC over USB, where a Python script, in combination with OpenCV and a Flask webserver, grabs the incoming synchronized frames.
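As a rough illustration of the capture side, the sketch below grabs frames from four USB cameras with OpenCV; splitting grab() and retrieve() latches all four sensors as close together in time as possible. The device indices are an assumption, and Bird's actual script and Flask streaming code are not reproduced here.

```python
# Sketch: grab roughly synchronized frames from four USB cameras with OpenCV.
import cv2

cams = [cv2.VideoCapture(i) for i in range(4)]    # device indices assumed

while True:
    for cam in cams:
        cam.grab()                                # latch every sensor first...
    frames = [cam.retrieve()[1] for cam in cams]  # ...then decode the images

    # ...detect IR blobs here and hand them to the triangulation step...
    if frames[0] is not None:
        cv2.imshow("cam0", frames[0])             # quick preview of one feed
    if cv2.waitKey(1) == 27:                      # Esc to quit
        break

for cam in cams:
    cam.release()
cv2.destroyAllWindows()
```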

Determining drone positions in 3D space

The most difficult part of this several-month project was figuring out how to take a series of images from different cameras at different angles and recover an object's position in 3D space. Calibration relies on epipolar geometry: when two cameras observe the same points, those correspondences constrain the rotation and translation between the cameras, which can be recovered by aligning the virtual rays each camera casts toward the markers. By repeating this process for every pair of cameras, the system can triangulate a drone from the intersection of the rays passing through its IR markers.
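A compact way to see the math is with OpenCV's epipolar-geometry routines: given matched 2D points from a pair of cameras, the essential matrix yields their relative rotation and translation (up to scale), after which markers can be triangulated. This is a generic sketch of the technique using synthetic data, not Bird's calibration code; the intrinsics and poses are invented for the example.

```python
# Sketch: recover the relative pose of two cameras from matched 2D points,
# then triangulate marker positions in 3D. All values are synthetic.
import cv2
import numpy as np

K = np.array([[600.0, 0, 320], [0, 600.0, 240], [0, 0, 1]])  # assumed intrinsics

# Synthetic 3D points standing in for IR markers seen by both cameras
pts3d = np.random.uniform([-1, -1, 3], [1, 1, 5], (50, 3))

# Camera 1 sits at the origin; camera 2 is rotated and translated
R_true, _ = cv2.Rodrigues(np.array([0.0, 0.3, 0.0]))
t_true = np.array([[1.0], [0.0], [0.0]])

def project(pts, R, t):
    cam = (R @ pts.T + t).T               # world -> camera coordinates
    uv = (K @ cam.T).T
    return uv[:, :2] / uv[:, 2:]          # perspective divide

pts1 = project(pts3d, np.eye(3), np.zeros((3, 1)))
pts2 = project(pts3d, R_true, t_true)

# Estimate the essential matrix from correspondences, then recover the pose
E, _ = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
_, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)  # translation is up to scale

# Triangulate: intersect the rays from both cameras for each marker
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([R, t])
X = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
X = (X[:3] / X[3]).T                      # homogeneous -> 3D coordinates
```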

The webpage

Lastly, Bird created a web application that exposes a front end full of controls for the cameras and drones. Here, drones can be assigned setpoints at which they will attempt to hover, and to which they will return if bumped off-course. One of the most interesting features is the trajectory generation tool, which lets a user enter a list of waypoints with movement values between them, preview the resulting 3D path on a chart, and then execute it on the drones.
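To make the trajectory tool concrete, here is a hypothetical sketch of how a list of waypoints with per-segment travel times could be expanded into a stream of hover setpoints; the waypoint format, durations, and update rate are all assumptions rather than details of Bird's interface.

```python
# Hypothetical sketch: expand (x, y, z) waypoints with per-segment travel
# times into a stream of intermediate setpoints for the hover controller.
import numpy as np

waypoints = np.array([[0.0, 0.0, 1.0],    # illustrative positions in meters
                      [1.0, 0.0, 1.5],
                      [1.0, 1.0, 1.0]])
durations = [2.0, 3.0]                    # seconds per segment (assumed)
rate = 20                                 # setpoint updates per second

setpoints = []
for (a, b), t in zip(zip(waypoints, waypoints[1:]), durations):
    steps = int(t * rate)
    for s in range(steps):
        setpoints.append(a + (b - a) * (s / steps))  # linear interpolation

# Each entry in `setpoints` would then be sent to the drone over ESP-NOW.
```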

To see more about this project, you can watch Bird's demonstration video on YouTube or read about it on his blog.

Evan Rust
IoT, web, and embedded systems enthusiast. Contact me for product reviews or custom project requests.