Researchers Draw Inspiration From Bats for an Ultrasonic "Perception Stack" Targeting Tiny Drones

Drawing just milliwatts of power, this ultrasonic ranging system filters out the drone's own rotor noise to deliver accurate obstacle avoidance.

Scientists from the Worcester Polytechnic Institute, working with TDK InvenSense, have taken inspiration from bats to build a low-power ultrasound sensing system that could let tiny drones navigate even through dust and fog.

"Tiny palm-sized aerial robots have exceptional agility and cost-effectiveness in navigating confined and cluttered environments," the researchers say by way of background to their work. "However, their limited payload capacity directly constrains the sensing suite onboard the robot, thereby limiting critical navigational tasks in Global Positioning System (GPS)-denied wild scenes. Common methods for obstacle avoidance use cameras and light detection and ranging (LIDAR), which become ineffective under visually degraded conditions such as low visibility, dust, fog, or darkness. Other sensors, such as radio detection and ranging (RADAR), have high power consumption, making them unsuitable for tiny aerial robots."

Researchers have taken inspiration from bats to develop a low-power ultrasonic "perception stack" for lightweight drones. (📷: Velmurugan et al)

The team's solution: Saranga, a low-power ultrasound "perception stack" inspired by bats. Using a dual ultrasonic sonar array, the drone sends out a signal and listens for the echoes, allowing it to detect obstacles that are invisible to camera- and lidar-based sensing systems.
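The ranging principle behind any echo-based sensor is time of flight: measure how long the echo takes to return, and distance follows from the speed of sound. The paper's actual pipeline is far more sophisticated, but the core arithmetic can be sketched as follows (the function name and figures here are illustrative, not taken from the Saranga codebase):

```python
SPEED_OF_SOUND_M_S = 343.0  # speed of sound in air at roughly 20 °C


def echo_distance_m(round_trip_s: float) -> float:
    """Distance to an obstacle, given the round-trip echo delay in seconds.

    The pulse travels out and back, so the one-way distance is half
    the total path covered in that time.
    """
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0


# Example: an echo returning after about 5.8 ms puts the obstacle
# roughly one meter away.
print(round(echo_distance_m(0.00583), 2))
```

In practice the hard part is not this formula but deciding which blip in a noisy waveform is a real echo, which is exactly the problem the researchers tackle next.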

The idea of using ultrasound for obstacle detection in the dark or in visually challenging environments isn't new (sonar's origins lie in experiments with detecting vessels through water), but it comes with some challenges. Biggest among these is a very low signal-to-noise ratio, which makes it hard to pick out actual obstacles; this is doubly true for small propeller-driven drones, which generate their own high-pitched motor and rotor noise as they fly.

The team's fix is two-pronged. First, they developed a method for filtering out the ultrasonic noise produced by the drone's own propellers. Second, they trained a custom neural network, designed to run on the Google Coral Tensor Processing Unit (TPU) accelerator, on long-horizon ultrasound echoes, finding patterns beneath the noise that were hidden from traditional approaches. To tie everything together, they implemented all of this in a platform that draws just milliwatts during active sensing.
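The paper's filtering method isn't reproduced here, but the general idea of suppressing self-noise at known frequencies can be illustrated with a simple sketch: if the rotor tone's frequency is known (for example, from the motor's commanded RPM), its amplitude and phase can be estimated by projecting the recorded signal onto sine and cosine at that frequency, and the tone then subtracted. All names and numbers below are hypothetical; this is one textbook approach, not the authors' implementation.

```python
import math


def remove_tone(samples, fs, tone_hz):
    """Subtract a single known-frequency tone (e.g. rotor noise) from a
    sampled signal by least-squares projection onto sine/cosine at that
    frequency. Works best over a whole number of tone periods."""
    n = len(samples)
    cos_ref = [math.cos(2 * math.pi * tone_hz * i / fs) for i in range(n)]
    sin_ref = [math.sin(2 * math.pi * tone_hz * i / fs) for i in range(n)]
    # In-phase and quadrature amplitudes of the interfering tone.
    a = 2.0 / n * sum(x * c for x, c in zip(samples, cos_ref))
    b = 2.0 / n * sum(x * s for x, s in zip(samples, sin_ref))
    # Reconstruct the tone and subtract it, leaving the echoes behind.
    return [x - a * c - b * s for x, c, s in zip(samples, cos_ref, sin_ref)]
```

A real system would track a changing rotor frequency and handle broadband noise as well, which is where the learned, echo-history-based model described in the paper comes in.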

The team tested the system both indoors and outdoors, showing a high level of success in varied environments. (📹: Velmurugan et al)

"We enabled a palm-sized aerial robot to navigate under visually degraded conditions of dense fog, darkness, and snow in a cluttered environment with thin and transparent obstacles using only onboard sensing and computation," the researchers say of their experimental proof. "We extensively evaluated our approach across three different forest environments with low, medium, and high tree densities. The aerial robot consistently avoided obstacles, achieving success rates of 90.9, 77.3, and 85.7% across all three scenarios."

The team's work has been published in the journal Science Robotics under open-access terms. More information is available on the project website, with code, datasets, and models available on GitHub under an unspecified license.

ghalfacree

Freelance journalist, technical author, hacker, tinkerer, erstwhile sysadmin. For hire: freelance@halfacree.co.uk.
