Lightweight AI for Autonomous Drones

Researchers developed a compact neural network that runs on a $21 Mango Pi single-board computer to give even tiny drones autonomy.

Nick Bild
4 months ago · Drones
An insect-inspired algorithm allows for autonomous navigation (📷: Y. Zhang et al.)

When it comes to jobs like mapping, infrastructure inspection, and search and rescue operations, drones are second to none. At a moment's notice, a dozen inexpensive aerial vehicles can be in the sky, offering a view that would be costly or impossible to get with technologies of the past. But much greater value is unlocked when these drones operate autonomously: taking the human operator out of the loop allows operations to scale up massively while slashing budgets.

Even so, most applications today still rely on human operators. While advances in sensing technologies and artificial intelligence algorithms have made autonomous operation possible, the computing hardware required to run the software tends to be bulky and heavy. That is a bad combination for a battery-powered aerial vehicle that is already severely constrained in flight time.

A group led by researchers at Shanghai Jiao Tong University asked why this should be the case. After all, insects are extraordinarily light, yet they have no trouble flying autonomously. Driven by this insight, the team developed an insect-inspired drone control algorithm. It is lightweight, both in its sensing and its processing requirements, yet it has proven effective in practice.

The approach leverages a compact artificial neural network that processes ultra-low-resolution 12x16 depth maps captured by an Intel RealSense camera to make real-time flight decisions. Unlike traditional navigation pipelines, which separate tasks like mapping, localization, and control into discrete modules, this system operates end-to-end, directly converting sensor input into control outputs in one step. By removing the need to pass data between multiple modules, the system eliminates sources of latency and reduces cumulative error.
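To make the end-to-end idea concrete, here is a minimal sketch of a network that maps an ultra-low-resolution 12x16 depth map directly to control outputs in a single forward pass. The layer widths, kernel sizes, weights, and the four-value output format (assumed here to be thrust plus three body rates) are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

def conv2d_relu(x, w, b):
    """Valid 3x3 convolution with ReLU over an (H, W, C_in) input."""
    kh, kw, cin, cout = w.shape
    H, W, _ = x.shape
    out = np.zeros((H - kh + 1, W - kw + 1, cout))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = x[i:i + kh, j:j + kw, :]
            out[i, j] = np.tensordot(patch, w, axes=([0, 1, 2], [0, 1, 2])) + b
    return np.maximum(out, 0.0)

rng = np.random.default_rng(0)
depth = rng.random((12, 16, 1))  # ultra-low-res depth map from the camera

# Three convolutional layers, matching the count described in the article
w1, b1 = rng.normal(size=(3, 3, 1, 8)) * 0.1, np.zeros(8)
w2, b2 = rng.normal(size=(3, 3, 8, 8)) * 0.1, np.zeros(8)
w3, b3 = rng.normal(size=(3, 3, 8, 8)) * 0.1, np.zeros(8)

h = conv2d_relu(conv2d_relu(conv2d_relu(depth, w1, b1), w2, b2), w3, b3)

# Flatten and map to 4 control outputs (assumed: thrust + 3 body rates)
w_fc = rng.normal(size=(h.size, 4)) * 0.01
controls = h.reshape(-1) @ w_fc
print(controls.shape)  # (4,)
```

Note how the sensor input becomes control output with no intermediate map, localization estimate, or planner: that single pipeline is what removes the inter-module latency and cumulative error the article describes.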

Even as a single end-to-end network, the algorithm still needed to be compact enough to run onboard a drone. As such, it was built with just three convolutional layers. It was then trained in a custom simulator populated with geometric shapes such as cubes and cylinders standing in for cluttered environments. The training leveraged differentiable physics, which allowed the physical dynamics of a quadrotor to be embedded directly into the learning process. Aligning learning with physics helps the model internalize how drone movements interact with the world, making it more robust and generalizable.
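The core idea behind differentiable-physics training can be shown with a toy example that is much simpler than the team's quadrotor simulator: roll out dynamics that have a known derivative, then push the gradient of the task loss back through the physics to the control parameter. The 1-D point mass, timestep, and learning rate below are all illustrative assumptions.

```python
# Toy differentiable-physics training loop: a 1-D point mass under a
# constant acceleration u should reach a goal position. Because the
# dynamics are differentiable, the gradient of position with respect
# to u has a closed form, so we can do gradient descent through them.

dt, steps, goal = 0.1, 20, 1.0

def rollout(u):
    """Simulate `steps` timesteps of constant acceleration u."""
    x, v = 0.0, 0.0
    for _ in range(steps):
        v += u * dt
        x += v * dt
    return x

# After n steps, x = u * dt^2 * n(n+1)/2, so dx/du is a constant.
dx_du = dt * dt * steps * (steps + 1) / 2

u = 0.0
for _ in range(100):  # gradient descent on (x - goal)^2 through the physics
    grad = 2.0 * (rollout(u) - goal) * dx_du
    u -= 0.05 * grad

print(round(rollout(u), 3))  # converges to the goal of 1.0
```

A real quadrotor version replaces the closed-form derivative with automatic differentiation through the full rigid-body dynamics, but the principle is the same: the simulator itself supplies the training gradient, rather than a separate reward signal.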

The entire model weighs in at under 2 MB in size and can run in real time on a $21 Mango Pi embedded computing board, making it one of the most cost-effective autonomous drone navigation systems around. During testing, the drones were able to navigate indoor environments and avoid collisions, even in multi-agent swarm configurations, without any centralized planning or wireless communication between units.

Looking ahead, the team is exploring the use of optical flow (another biologically inspired sensing technique) as a replacement for depth maps, and is also working on improving the interpretability of their end-to-end learning system. Refinements such as these may ultimately lead to a future where lightweight autonomous drones rule the skies.

Nick Bild
R&D, creativity, and building the next big thing you never knew you wanted are my specialties.