When animals are hunted to the point of extinction, it creates an ecological gap that can harm the health of the environment. This puts people at risk of losing access to clean air, clean water, and fertile land for agriculture. In spite of these concerns, many presently endangered animals are still being illegally hunted for profit or sport. This includes the critically endangered African black rhinoceros, of which only a few thousand still exist in the wild. Conservation groups face a constant struggle: poachers are highly motivated by the enormous prices rhino horns fetch, and keeping watch over a widely dispersed animal population is technically very challenging.
The tools available to conservationists today are simply not meeting their needs. Satellite monitoring, for example, cannot reliably detect any but the largest animal species. There are also many types of tracking devices that can be physically attached to animals, but these tend to be expensive and difficult to deploy in large numbers. Moreover, worn devices frequently stress animals and change their behavior. A team led by researchers at the University of California, Berkeley is leveraging recent innovations in edge computing and machine learning that may make it possible to track animal populations efficiently, inexpensively, and unobtrusively.
Their idea was to use a Parrot Anafi quadcopter drone to build a wildlife-spotting device that can cover large areas of rough terrain from the air. They paired it with an NVIDIA Jetson Xavier NX module to run the machine learning algorithms onboard the drone. With the module capable of 21 trillion operations per second, the team was essentially able to put a powerful supercomputer in the air to give them a leg up on poachers. The real-time, local processing offered by the Jetson is essential for the device, as wireless network connectivity is spotty or non-existent at many wildlife tracking sites. When a wireless connection, even a poor one, does become available, the drone can send the results of the onboard analyses to notify researchers and authorities alike of any potential concerns.
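This store-and-forward pattern, running inference locally and deferring notification until a link appears, can be sketched in Python. The class and function names below are hypothetical illustrations of the idea, not code from the Berkeley project:

```python
import json
import time
from collections import deque

class DetectionRelay:
    """Buffer onboard detections and forward them when a network link appears.

    Because inference runs locally on the drone, detections accumulate even
    with no connectivity; buffered results are sent in a batch the moment a
    link, however poor, becomes available.
    """

    def __init__(self, send_fn, link_up_fn):
        self._send = send_fn        # callable that transmits one JSON payload
        self._link_up = link_up_fn  # callable returning True when a link exists
        self._buffer = deque()      # pending detections, oldest first

    def record(self, label, confidence, lat, lon):
        # Store each detection with a timestamp for later forwarding.
        self._buffer.append({
            "label": label,
            "confidence": confidence,
            "lat": lat,
            "lon": lon,
            "time": time.time(),
        })

    def flush(self):
        """Send buffered detections while the link holds; return count sent."""
        sent = 0
        while self._buffer and self._link_up():
            payload = self._buffer[0]
            self._send(json.dumps(payload))
            self._buffer.popleft()  # drop only after a successful send
            sent += 1
        return sent
```

In practice the drone would call `record` for each detection during flight and attempt a `flush` periodically; nothing is lost while the radio is down.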
In order to recognize specific animals, the team used a YOLOv5l6 object detection model. Initially focusing on the black rhino, the team collected a dataset of images from Namibia’s Kuzikus Wildlife Reserve and trained the model to recognize three classes: rhino, human, and other animals. Further experimentation showed that adding giraffes, ostriches, and springboks to the dataset further improved the model’s accuracy. Encouraged by this result, they added synthetic images created with SinGAN and Photoshop to increase the number of examples in the final dataset. Another round of training on this data produced a model capable of detecting black rhinos with an average accuracy of 81%.
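A YOLO-family detector emits one row per candidate box, each carrying coordinates, a confidence score, and a class index; turning those raw rows into rhino alerts is a small post-processing step. A minimal sketch follows, where the class list, ordering, and threshold are illustrative assumptions rather than the team's published configuration:

```python
# Illustrative label list; the project's actual class ordering is not
# published here.
CLASS_NAMES = ["rhino", "human", "giraffe", "ostrich", "springbok", "other"]

def filter_detections(raw_boxes, target="rhino", conf_threshold=0.5):
    """Keep detections of `target` above the confidence threshold.

    `raw_boxes` mimics a YOLOv5-style per-image output: an iterable of
    (x1, y1, x2, y2, confidence, class_index) rows. Returns
    (x1, y1, x2, y2, confidence) tuples for the target class only.
    """
    hits = []
    for x1, y1, x2, y2, conf, cls in raw_boxes:
        if CLASS_NAMES[int(cls)] == target and conf >= conf_threshold:
            hits.append((x1, y1, x2, y2, conf))
    return hits
```

Keeping the alert logic separate from the model means the same filtering can be reused as classes are added to the dataset, as the team did with the giraffe, ostrich, and springbok.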
The team has successfully shown that it is possible to build an aerial wildlife detector that operates in areas with limited or no Internet connectivity, while keeping costs relatively low and remaining unobtrusive to the animals being tracked. These improvements over presently available options may help protect both endangered species and human wellbeing.