Interpreting Smoke Signals with AI and Drones
UMN engineers used drone swarms to map wildfire smoke in 3D, improving predictions, early warnings, and tracking of airborne hazards.
Summer may be winding down in the Northern Hemisphere, but the threat of wildfires is still at the forefront of the minds of the many people who live in regions prone to dry conditions. The effects of these blazes can extend for many hundreds of miles beyond their sources. Large wildfires produce massive smoke plumes that carry particles to distant locations, where they degrade air quality and give the sky a hazy appearance. That is bad news for everyone in those areas, but especially for individuals with asthma, COPD, or other conditions that impact the lungs.
Short of preventing the wildfires in the first place, there is little we can do to avoid these air quality problems. But now we will at least be better able to predict which regions will be impacted, thanks to the work of a team of engineers at the University of Minnesota. They have developed a system of aerial robots, powered by artificial intelligence (AI), that predicts where smoke is headed by flying up close and inspecting the plumes. That early warning could give the most vulnerable people time to prepare or leave the area.
The system relies on the deployment of coordinated aerial robots that can detect smoke, fly directly into plumes, and capture data from multiple perspectives. Using this information, the system generates detailed three-dimensional reconstructions of how smoke disperses over time. Being so close to the smoke enables the system to capture data with a level of resolution that is not available to traditional observation tools like satellites or ground-based lidar.
The architecture of the system consists of a “manager” drone and several “worker” drones, each equipped with high-resolution cameras and GPS modules. The manager drone autonomously positions itself above a smoke plume and directs the worker drones to orbit in a synchronized circular pattern. This coordinated effort ensures that there is sufficient multi-angle coverage of the plume. The collected images are then processed using a technique called Neural Radiance Fields (NeRF) to produce high-resolution 3D reconstructions. In field tests, the drones successfully captured critical plume characteristics such as shifting wind directions, lofting behavior, and changes in volume at a temporal resolution of about one second.
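To make the coordination idea concrete, here is a minimal sketch of how a manager drone might assign worker drones to a synchronized circular orbit around a plume. This is an illustrative assumption, not the team's actual flight software: the function names (`orbit_waypoints`, `phase_at`), the orbit period, and the shared-altitude simplification are all hypothetical, chosen only to show how evenly spaced, multi-angle viewpoints could be generated for a NeRF-style reconstruction.

```python
import math

def orbit_waypoints(center, radius, num_workers, phase):
    """Place num_workers drones evenly around a circle of the given
    radius (meters) centered on the plume, offset by a shared phase
    angle (radians) so the whole formation rotates together.

    center is an (x, y, z) position; in this simplified sketch all
    workers fly at the manager-assigned altitude center[2].
    """
    cx, cy, cz = center
    waypoints = []
    for k in range(num_workers):
        # Evenly space the workers, then rotate the whole ring by `phase`.
        theta = phase + 2.0 * math.pi * k / num_workers
        waypoints.append((cx + radius * math.cos(theta),
                          cy + radius * math.sin(theta),
                          cz))
    return waypoints

def phase_at(t_seconds, orbit_period_s=60.0):
    """Shared orbit phase at time t, so waypoints can be refreshed
    roughly once per second (the temporal resolution reported for
    the captures) while the formation stays synchronized."""
    return 2.0 * math.pi * (t_seconds % orbit_period_s) / orbit_period_s

# Example: four workers orbiting 30 m from the plume center at 50 m altitude.
ring = orbit_waypoints((0.0, 0.0, 50.0), 30.0, 4, phase_at(0.0))
```

Evenly spaced bearings are what make the approach work with NeRF: the reconstruction quality depends on having well-distributed viewing angles around the volume, which a synchronized ring of cameras provides by construction.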
The potential applications of the technology could extend beyond wildfires. The team believes that their system could be adapted to track sandstorms, volcanic eruptions, and other airborne hazards. And because the drones are relatively inexpensive compared to satellite systems, the approach could allow for widespread deployment in regions where such natural hazards regularly occur.
Looking ahead, the team plans to integrate more advanced sensors into their drones, including holographic imaging systems capable of analyzing the composition of airborne particles in even greater detail. They are also testing fixed-wing drones with vertical takeoff and landing capabilities, which can fly longer missions without a runway.
If this research has piqued your interest and you think you have some innovative edge AI solutions of your own that address critical environmental challenges, you should definitely check out the Edge AI Earth Guardians competition. Not only can you do some good for our favorite little blue planet, but you can also win some big prizes!