This Triple-Camera Hyperspectral Drone Can Scan an Entire Sesame Field for Signs of Plant Stress
A trio of cameras feeding a custom deep learning model delivers up to 90 percent accuracy in classifying crop nutrient and water stress.
Researchers from the Hebrew University of Jerusalem, the Virginia Polytechnic Institute and State University, the University of Tokyo, and the Volcani Institute have come up with a way to improve sesame farming yields: packing a drone with three different types of camera.
"By integrating data from multiple UAV [Uncrewed Aerial Vehicle]-imaging sources and training deep learning models to analyze it, we can now distinguish between stress factors that were previously challenging to tell apart," explains corresponding author Ittai Herrmann of the project. "This capability is vital for precision agriculture and for adapting to the challenges of climate change."
Plant stressors have a major impact on the eventual yield of crops, including sesame, but keeping an eye on them can be challenging. Physically visiting each field and checking by hand works, but it is time-consuming and can't deliver rapid results. Remote sensor networks can deliver ongoing monitoring, but only in specific areas. The team's solution: a drone equipped with thermal, visible-light, and hyperspectral cameras, which can quickly capture data for an entire field.
Data from the drone's cameras are fed into a machine learning system trained to spot key plant stressors, including a lack or excess of nitrogen and water. It's not the first time such an approach has been tried, but the results are a considerable improvement over the state of the art: by using all three data sources, the team claims to have boosted classification accuracy for combined nutrient and water stress from the 40-55 percent of existing approaches to between 65 and 90 percent.
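To make the idea of multi-sensor fusion concrete, here is a minimal, illustrative sketch in PyTorch of one common way to combine several camera modalities: each sensor's features get their own encoder, and the encoded features are concatenated before a shared classification head. The feature dimensions, encoder sizes, and class count below are placeholder assumptions for demonstration only, not the architecture or data described in the team's paper.

```python
import torch
import torch.nn as nn

# Hypothetical per-modality feature sizes -- illustrative only,
# not the dimensions used in the published study.
N_RGB, N_THERMAL, N_HYPERSPECTRAL = 32, 8, 128
N_CLASSES = 4  # e.g., combinations of nitrogen and water stress levels

class FusionClassifier(nn.Module):
    """Toy late-fusion network: encode each modality separately,
    then concatenate the features for a single classification head."""
    def __init__(self):
        super().__init__()
        self.rgb_enc = nn.Sequential(nn.Linear(N_RGB, 16), nn.ReLU())
        self.thermal_enc = nn.Sequential(nn.Linear(N_THERMAL, 8), nn.ReLU())
        self.hyper_enc = nn.Sequential(nn.Linear(N_HYPERSPECTRAL, 32), nn.ReLU())
        self.head = nn.Linear(16 + 8 + 32, N_CLASSES)

    def forward(self, rgb, thermal, hyper):
        # Fuse the three encoded feature vectors along the feature axis.
        fused = torch.cat([self.rgb_enc(rgb),
                           self.thermal_enc(thermal),
                           self.hyper_enc(hyper)], dim=1)
        return self.head(fused)

# Random stand-in data for a batch of four field plots.
model = FusionClassifier()
logits = model(torch.randn(4, N_RGB),
               torch.randn(4, N_THERMAL),
               torch.randn(4, N_HYPERSPECTRAL))
print(logits.argmax(dim=1))  # predicted stress class per plot
```

Late fusion of this kind lets each sensor's data be handled on its own terms before the model learns cross-modal relationships; the team's actual model may combine the imagery differently.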
The team's work has been published in the ISPRS Journal of Photogrammetry and Remote Sensing under closed-access terms.