Abstract: Wildlife conservation in expansive parks faces significant challenges, including the high cost of monitoring, slow response times during crises, and the difficulty of locating deceased animals. This report details the development of Project AeroGuard, a system that leverages Unmanned Aerial Vehicles (UAVs) and Artificial Intelligence to address these challenges. Our system introduces a two-tiered approach: first, it creates a dynamic census of animal populations by "digitally tagging" individuals using unique visual embeddings based on their features. This database serves as a ground truth for population management. Second, in response to events like a disease outbreak, the system can perform targeted searches for specific species. If an animal is not found, the system initiates a secondary protocol to locate congregations of vultures, using their presence as a high-probability indicator for animal carcasses. We have developed and tested a species classification model (SpeciesNet), a LoFTR for image matching, and a web-based interface for data streaming. Project AeroGuard aims to provide park rangers with a cost-effective, efficient, and data-driven tool, transforming traditional conservation methods and enhancing wildlife protection capabilities.
1.1 Background and Motivation
Protected areas, such as national parks, are cornerstones of biodiversity conservation. However, managing these vast landscapes presents immense logistical and financial hurdles. Traditional wildlife monitoring, often relying on manned helicopter or vehicle surveys, is prohibitively expensive, infrequent, and can cause stress to the animals. This leads to outdated census data and delayed responses to critical events such as poaching incidents, disease outbreaks, or animals in distress.
The advent of Unmanned Aerial Vehicles (UAVs or drones) combined with Artificial Intelligence offers a paradigm shift in conservation technology. Drones provide an affordable and low-impact method for aerial surveillance, while AI can automate the process of identifying, counting, and even re-identifying individual animals.
1.2 Project Goals and Objectives
The primary goal of Project AeroGuard is to develop an end-to-end, drone-based system to enhance wildlife monitoring and incident response for conservation authorities.
Our specific objectives are:
- To create a reliable and automated method for counting animals of various species within a park.
- To establish a "digital tag" for individual animals using AI-generated embeddings, creating a searchable ground-truth database.
- To develop a rapid-response protocol to locate specific species during a public health or security crisis.
- To implement a carcass-identification method by correlating the location of vultures with potential animal mortality sites.
Park management authorities currently face the following critical challenges that Project AeroGuard aims to address:
- High Cost of Surveillance: Helicopter surveys can cost thousands of dollars per hour, limiting their frequency and scope.
- Lack of Real-Time Data: Annual or bi-annual census data is insufficient for managing dynamic situations like disease outbreaks.
- Inefficient Search Operations: Locating a specific sick, injured, or missing animal in a vast territory is like finding a needle in a haystack.
- Delayed Carcass Detection: Deceased animals, especially from disease or poaching, are often found too late for forensic analysis or for preventing further spread of disease.
Our methodology is structured into a two-phase operational workflow, designed to be deployed from a drone platform.
3.2 Phase 1: Digital Census and Ground Truth Creation
- Data Acquisition: A drone equipped with a high-resolution camera flies a pre-programmed grid pattern over the survey area.
- Species Identification: The captured aerial imagery is processed by our SpeciesNet model. The model identifies the species of each animal detected (e.g., rhino, elephant, vulture).
- Digital Tagging: For each identified animal, a unique feature vector (embedding) is generated from its image. This embedding acts as a "digital tag."
- Database Population: The information—species, GPS location, timestamp, image, and the unique embedding—is stored in an onboard database. This database becomes the dynamic ground truth of the park's wildlife population. When new, untagged animals are encountered during subsequent surveys, they are added to the database.
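The database-population step above can be sketched as follows. This is a minimal, illustrative in-memory stand-in, not the project's actual storage layer; the record fields mirror those listed in the text (species, GPS location, timestamp, embedding), while the class and method names are assumptions.

```python
from dataclasses import dataclass

@dataclass
class AnimalRecord:
    animal_id: int
    species: str
    lat: float
    lon: float
    timestamp: str
    embedding: list  # the feature vector acting as the "digital tag"

class CensusDatabase:
    """Minimal in-memory stand-in for the onboard ground-truth database."""

    def __init__(self):
        self.records = []
        self._next_id = 1

    def add(self, species, lat, lon, timestamp, embedding):
        """Register a newly encountered, untagged animal and return its id."""
        rec = AnimalRecord(self._next_id, species, lat, lon, timestamp, embedding)
        self.records.append(rec)
        self._next_id += 1
        return rec.animal_id

    def by_species(self, species):
        """All known individuals of one species (e.g. for a targeted search)."""
        return [r for r in self.records if r.species == species]

# Illustrative usage: tag one rhino seen during a survey pass.
db = CensusDatabase()
rhino_id = db.add("rhino", -0.45, 36.92, "2024-06-01T08:30:00Z", [0.12, 0.88, 0.33])
```

In deployment this store would be persisted onboard and queried by the Phase 2 search tasks.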
3.3 Phase 2: Crisis Response and Targeted Search
- Crisis Initiation: A crisis is identified (e.g., a suspected anthrax outbreak affecting rhinos).
- Targeted Search: The system is tasked to find all animals of the target species (rhinos). The drone can be dispatched to the last known locations of these animals using data from the ground-truth database.
- Real-Time Verification: As the drone locates rhinos, it captures images, generates embeddings, and matches them against the database to confirm the identity of each individual. This allows rangers to systematically check off each animal.
- Alert Generation: If an animal cannot be located after a thorough search, it is flagged as "missing," and its last known coordinates are provided to the ranger team.
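The alert-generation step can be sketched as a simple set difference between the target-species roster in the ground-truth database and the individuals verified in the field. The function and field names here are illustrative assumptions, not the project's actual interface.

```python
def missing_report(records, verified_ids):
    """Flag unverified animals as missing, with their last known coordinates.

    records:      list of (animal_id, lat, lon) rows for the target species,
                  drawn from the ground-truth database
    verified_ids: set of ids whose identity was confirmed during the search
    """
    return [
        {"animal_id": aid, "status": "missing", "last_known": (lat, lon)}
        for aid, lat, lon in records
        if aid not in verified_ids
    ]

# Illustrative usage: rhino 1 was verified in the field, rhino 2 was not.
alerts = missing_report([(1, -0.45, 36.92), (2, -0.41, 36.88)], verified_ids={1})
```

Each alert carries the coordinates that seed the secondary vulture-search protocol described next.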
If an animal is flagged as missing, the system initiates a secondary search protocol.
- Vultures possess highly acidic stomachs, allowing them to safely consume carcasses that would be toxic to other scavengers. This makes them exceptional natural indicators of animal mortality.
- The drone's mission is updated to survey the area around the missing animal's last known location, specifically searching for congregations of vultures.
- The SpeciesNet model, which can accurately identify vultures, flags any significant groupings.
- A high concentration of vultures on the ground is reported to the rangers as a high-probability location of a carcass, enabling rapid recovery and investigation.
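One way to turn individual vulture detections into a "congregation" report is a simple distance-based grouping of their GPS fixes. The sketch below uses greedy clustering with the haversine distance; the radius and minimum flock size are assumed parameters, not values from the project.

```python
import math

def haversine_m(p, q):
    """Great-circle distance in metres between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = (math.sin(dlat / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2)
    return 2 * 6371000 * math.asin(math.sqrt(a))

def vulture_congregations(detections, radius_m=50.0, min_birds=4):
    """Greedily group vulture GPS detections; any group with at least
    min_birds members is reported as a probable carcass site."""
    clusters = []
    for det in detections:
        for cluster in clusters:
            if haversine_m(det, cluster[0]) <= radius_m:
                cluster.append(det)
                break
        else:
            clusters.append([det])  # no nearby cluster: start a new one
    return [c for c in clusters if len(c) >= min_birds]

# Illustrative usage: four birds within ~20 m, one lone bird far away.
dets = [(-0.5000, 36.9000), (-0.5001, 36.9000), (-0.5000, 36.9001),
        (-0.5001, 36.9001), (-0.6000, 37.0000)]
sites = vulture_congregations(dets)
```

The centroid of each reported group would then be relayed to rangers as the high-probability carcass location.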
The system was developed and tested using the following components. The final version is intended to be deployed on a field-ready UAV.
- UAV Platform: DJI Mavic 3 Classic
- Onboard Computing: Raspberry Pi 5 (8 GB RAM) with AI HAT
- Camera Sensor: 12MP with autofocus
We use SpeciesNet, which can identify up to 2,200 animal categories, including the African elephant and the black and white rhinos. SpeciesNet was trained on over 65 million globally distributed images, making it well suited for this purpose. It can also identify seven vulture species: turkey vulture, American black vulture, cinereous vulture, lesser yellow-headed vulture, greater yellow-headed vulture, king vulture, and griffon vulture.
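For the carcass protocol, the system only needs to pick the vulture detections out of SpeciesNet's output labels. The sketch below assumes the classifier yields plain species-name strings; the function name and label spellings are illustrative, not SpeciesNet's actual API.

```python
# The seven vulture species listed above, lower-cased for matching.
VULTURE_SPECIES = {
    "turkey vulture", "american black vulture", "cinereous vulture",
    "lesser yellow-headed vulture", "greater yellow-headed vulture",
    "king vulture", "griffon vulture",
}

def flag_vultures(labels):
    """Return the indices of detections whose label is a vulture species."""
    return [i for i, label in enumerate(labels) if label.lower() in VULTURE_SPECIES]

# Illustrative usage on one frame's classifier output.
hits = flag_vultures(["African elephant", "Turkey vulture", "King vulture"])
```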
The images below show the output of SpeciesNet. Figure 2 shows data collected from a drone survey in Laikipia, with grazing elephants that SpeciesNet accurately detects. Figure 3 is an image of vultures sourced from Google, which SpeciesNet identifies as turkey vultures feeding on a carcass.
For individual re-identification, we employed LoFTR (Local Feature TRansformer), a state-of-the-art local feature matching framework that eliminates the need for traditional keypoint detection. Each animal image is first encoded into a high-dimensional embedding vector that captures unique visual features. To determine whether a newly observed animal corresponds to an individual previously stored in the reference database, cosine similarity is computed between the new embedding and all existing embeddings. Cosine similarity is particularly well-suited for this task as it evaluates the angular distance between vectors, making it robust to variations in scale, lighting, and partial occlusions.
This approach enables efficient and accurate re-identification by (i) leveraging LoFTR’s ability to extract fine-grained local correspondences, and (ii) utilising cosine similarity for rapid vector comparison across large databases. Together, these components support scalable and non-invasive digital tagging of wildlife, facilitating longitudinal monitoring without the need for physical collars or tags.
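The cosine-similarity matching step can be sketched as follows. This is a minimal pure-Python illustration of the comparison described above, not the project's actual implementation; the `reidentify` name and the 0.85 threshold are assumptions.

```python
import math

def cosine_similarity(a, b):
    """Angular similarity between two embedding vectors, in [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def reidentify(query, database, threshold=0.85):
    """Match a new embedding against every stored 'digital tag'.

    Returns the best-matching animal id, or None if no stored embedding
    reaches the similarity threshold (i.e. a new, untagged individual)."""
    best_id, best_sim = None, threshold
    for animal_id, emb in database.items():
        sim = cosine_similarity(query, emb)
        if sim >= best_sim:
            best_id, best_sim = animal_id, sim
    return best_id

# Illustrative usage with toy 3-dimensional embeddings.
tags = {"rhino_01": [1.0, 0.0, 0.0], "rhino_02": [0.0, 1.0, 0.0]}
match = reidentify([0.95, 0.05, 0.0], tags)
```

Because cosine similarity depends only on vector direction, the match is unaffected by uniform scaling of either embedding, which is the robustness property the text relies on.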
4.4 User Interface and Data Streaming
A proof-of-concept web-based user interface has been developed.
- Functionality: The website allows a user on the same network to view a live feed of what the system is detecting, including species labels and bounding boxes.
- Technology Stack: Python Flask backend with a simple HTML/CSS/JS frontend
- Testing: We successfully tested the streaming of this data from a local machine over a Wi-Fi network using ngrok, demonstrating the feasibility of remote monitoring.
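A sketch of the per-frame payload such a backend might stream to the frontend is shown below. The field names (`species`, `bbox`, `confidence`) are illustrative assumptions, not the project's actual schema; in the Flask backend this JSON would simply be returned from a route the frontend polls.

```python
import json

def detection_frame(detections):
    """Serialise one frame of detections into the JSON payload the web UI
    would render as species labels and bounding boxes.

    detections: iterable of (species, (x1, y1, x2, y2), confidence)
    """
    return json.dumps({
        "detections": [
            {"species": s, "bbox": list(box), "confidence": round(c, 3)}
            for s, box, c in detections
        ]
    })

# Illustrative usage for a single elephant detection.
payload = detection_frame([("African elephant", (10, 20, 110, 220), 0.9712)])
```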
Project AeroGuard has successfully demonstrated a feasible and innovative solution to some of the most pressing issues in wildlife conservation. The tools we have built support the work of rangers and conservationists end to end: species identification, exploiting the correlation between vultures and carcasses, and re-identification of individual animals via digital tags. This project lays the groundwork for a scalable, affordable, and highly effective tool that can revolutionise how we protect and manage our planet's most vulnerable species.
Future Work
While the current system is a robust proof-of-concept, the following steps are planned for future development:
- Full Drone Deployment: Conduct extensive field testing with a drone in a controlled environment.
- GSM/Satellite Connectivity: Incorporate a GSM or satellite communication module to enable the real-time transmission of critical alerts (e.g., animal location, carcass probability) from deep within parks where Wi-Fi is unavailable.
- Sensor Fusion: Integrate thermal imaging sensors to allow for effective nocturnal surveys, which is critical for monitoring many species and detecting poachers.
- Real-time data: Running SpeciesNet + LoFTR currently takes roughly 3 minutes per inference. While workable, this introduces a 3-minute lag before an animal is identified and re-identified. We plan to leverage the AI HAT fully to reduce this time.