Navigating Uncharted Skies

This control system gives drones the ability to autonomously track objects and navigate without GPS signals using low-cost hardware.

Nick Bild
Experimental setup for testing the UAV control system (📷: Y. Chen et al.)

Drones, also known as unmanned aerial vehicles (UAVs), have unlocked new possibilities in many industries by providing versatile platforms for tasks such as surveillance, mapping, delivery, and disaster response. One of the key capabilities that allow drones to perform these tasks autonomously is their ability to navigate without constant human intervention. Autonomous navigation entails the drone's capacity to determine its position, plan a path, and adjust its trajectory based on real-time data, all without direct human control. This is accomplished through a combination of cutting-edge technologies, with GPS receivers playing a critical role.

GPS receivers pick up signals from a constellation of satellites orbiting the Earth, providing precise timing and location information to the UAV. By measuring the arrival times of signals from multiple satellites (a process known as trilateration), the GPS receiver can determine the UAV's position accurately. This information is then used by the UAV's onboard computer to calculate its trajectory, speed, and direction, allowing it to follow a predetermined flight path or respond to dynamic environmental changes.
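
To make the idea concrete, here is a minimal sketch of how a position can be recovered from satellite ranges. It is purely illustrative: the satellite coordinates and ranges are made up, and the receiver clock bias that real GPS solutions must also estimate is ignored for brevity.

import numpy as np

# Toy multilateration sketch: given known satellite positions and measured
# ranges, recover the receiver position with a few Gauss-Newton iterations.
# (Real GPS also solves for the receiver clock bias; it is omitted here.)
sat_positions = np.array([          # hypothetical satellite coordinates, km
    [15600.0,  7540.0, 20140.0],
    [18760.0,  2750.0, 18610.0],
    [17610.0, 14630.0, 13480.0],
    [19170.0,   610.0, 18390.0],
])
true_receiver = np.array([1000.0, 2000.0, 3000.0])
ranges = np.linalg.norm(sat_positions - true_receiver, axis=1)  # "measured" ranges

estimate = np.zeros(3)              # initial guess at the receiver position
for _ in range(10):
    diffs = estimate - sat_positions                    # satellite-to-guess vectors
    dists = np.linalg.norm(diffs, axis=1)
    residuals = dists - ranges                          # range prediction errors
    jacobian = diffs / dists[:, None]                   # d(range)/d(position)
    estimate -= np.linalg.lstsq(jacobian, residuals, rcond=None)[0]

print(np.round(estimate, 3))        # converges to the true receiver position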

However, despite their widespread use and effectiveness, GPS signals can sometimes be unreliable or unavailable. Various challenges can arise that affect GPS signals, such as signal interference, urban canyons (tall buildings obstructing the signals), and adverse atmospheric conditions. When a UAV loses GPS signal, it can face difficulties in maintaining its intended course.

In cases where GPS signals are lost, alternate methods must be employed for a UAV to determine its position. Vision-based control systems, which use cameras and image processing algorithms to gather information about the drone's surroundings, are frequently used in these cases. One technique in particular, called visual servoing, which allows for precise relative positioning of a UAV with respect to an object of interest, was recently targeted for improvement by a team of researchers led by a group at Fuzhou University in China.

Tracking targets via visual servoing is a low-cost solution for a variety of UAV control tasks. By focusing specifically on image-based visual servoing, in which pose reconstructions are not required, the researchers were also able to eliminate the need for power-hungry computational hardware, as the algorithms are relatively lightweight. However, any noise that enters the image-based visual servoing calculations is known to negatively impact the accuracy of translational velocity measurements, and the accuracy of those measurements is critically important for the proper functioning of the flight controller.
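
To give a sense of how image-based visual servoing works in its classic form (this is the textbook point-feature control law, not the Fuzhou team's specific controller), the sketch below maps the error between current and desired image features directly to a camera velocity command, using only a rough depth estimate rather than a full pose reconstruction. All coordinates, depths, and the gain are illustrative assumptions.

import numpy as np

def interaction_matrix(x, y, Z):
    # Classic 2x6 image Jacobian for a point feature at normalized image
    # coordinates (x, y) with estimated depth Z.
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x],
    ])

def ibvs_velocity(features, desired, depths, gain=0.5):
    # Camera velocity command (vx, vy, vz, wx, wy, wz) driving the image
    # feature error toward zero: v = -gain * pinv(L) @ (s - s*).
    L = np.vstack([interaction_matrix(x, y, Z)
                   for (x, y), Z in zip(features, depths)])
    error = (np.asarray(features) - np.asarray(desired)).ravel()
    return -gain * np.linalg.pinv(L) @ error

# Hypothetical example: four tracked points slightly offset from their goals.
features = [(0.12, 0.10), (-0.08, 0.11), (-0.09, -0.10), (0.10, -0.12)]
desired  = [(0.10, 0.10), (-0.10, 0.10), (-0.10, -0.10), (0.10, -0.10)]
print(ibvs_velocity(features, desired, depths=[1.5] * 4))

Notice that an estimated depth still appears in the interaction matrix, which helps explain why a target that confounds depth estimation causes trouble, as discussed next.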

Some other known problems exist with this visual servoing technique. For example, a rotating target makes image depth estimation very challenging. Moreover, UAVs often experience a number of disturbances while in flight that arise from unpredictable, external sources, and these must be dealt with for accurate target tracking.

To address these challenges, the team proposed a number of enhancements to improve the accuracy of image-based visual servoing. A velocity observer was developed to estimate the relative velocities between the UAV and the target. This observer eliminated the need for translational velocity measurements, thereby sidestepping the control problems caused by noise in those measurements.
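
The paper's observer design is not detailed in this article, but the general idea can be illustrated with a simple Luenberger-style observer that reconstructs the unmeasured relative velocity from position measurements alone, so the controller never has to consume a noisy velocity signal directly. The model, gains, and noise levels below are assumptions made only for the sake of the sketch.

import numpy as np

class VelocityObserver:
    # Toy Luenberger-style observer for a double-integrator relative-motion
    # model: estimates relative velocity from position measurements only.
    def __init__(self, l1=8.0, l2=16.0):
        self.p_hat = np.zeros(3)   # estimated relative position
        self.v_hat = np.zeros(3)   # estimated relative velocity
        self.l1, self.l2 = l1, l2  # observer gains

    def update(self, p_meas, accel_cmd, dt):
        innovation = p_meas - self.p_hat
        self.p_hat += dt * (self.v_hat + self.l1 * innovation)
        self.v_hat += dt * (accel_cmd + self.l2 * innovation)
        return self.v_hat

# Hypothetical usage: constant true relative velocity, noisy position readings.
rng = np.random.default_rng(0)
observer, dt = VelocityObserver(), 0.02
p_true, v_true = np.zeros(3), np.array([0.5, -0.2, 0.0])
for _ in range(500):
    p_true = p_true + v_true * dt
    noisy_position = p_true + 0.01 * rng.standard_normal(3)
    v_est = observer.update(noisy_position, accel_cmd=np.zeros(3), dt=dt)
print(np.round(v_est, 2))          # approaches the true relative velocity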

A novel image depth model was also developed, which enables the system to track objects in any arbitrary orientation. This allows for accurate object tracking and the calculation of smooth trajectories even when the target is rotating. Unpredictable disturbances were also accounted for through the introduction of an integral-based filter that enhances tracking stability.
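
The team's integral-based filter itself is not spelled out here, but the intuition behind integral action in general is straightforward: a persistent, unmodeled disturbance shows up as a steady tracking error, and accumulating that error over time builds up a correction that cancels the bias. The sketch below uses a generic proportional-integral loop with made-up gains and a made-up disturbance, purely to show the effect.

import numpy as np

def pi_tracking_command(error, error_integral, dt, kp=1.0, ki=0.3):
    # The proportional term reacts to the instantaneous error; the integral
    # term slowly absorbs constant disturbances (e.g., a steady wind bias).
    error_integral = error_integral + error * dt
    command = -kp * error - ki * error_integral
    return command, error_integral

# Hypothetical loop: a constant disturbance pushes on the error. Without the
# integral term the error would settle at a nonzero offset; with it, the
# error is driven back to roughly zero.
error, integral = np.array([0.5]), np.zeros(1)
disturbance, dt = np.array([0.2]), 0.02
for _ in range(2000):
    command, integral = pi_tracking_command(error, integral, dt)
    error = error + dt * (command + disturbance)  # first-order error dynamics
print(np.round(error, 3))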

The Lyapunov method, which is used to assess the stability of nonlinear systems, was leveraged to analyze the stability of this custom image-based visual servoing controller. In experiments tracking a dynamic, rotating target, the controller demonstrated tracking stability and robustness, as well as the ability to operate acceptably in the presence of unexpected disturbances.
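
For readers unfamiliar with the technique, the flavor of a Lyapunov argument can be seen in the standard textbook case (not the paper's actual proof): if a control law drives the closed-loop feature error e according to ė = -λe for some positive gain λ, then the quadratic energy function

V(e) = \tfrac{1}{2}\, e^{\top} e, \qquad \dot{V}(e) = e^{\top} \dot{e} = -\lambda\, e^{\top} e \le 0

decreases along every trajectory, so the error converges to zero. The team's stability analysis follows this same general style of reasoning for their more complex observer-plus-controller dynamics.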

Moving forward, the team plans to refine their methods and apply them to practical, real-world scenarios involving capturing dynamic targets and making autonomous landings.

Nick Bild
R&D, creativity, and building the next big thing you never knew you wanted are my specialties.