This project focused on developing an embedded AI system for detecting diseased leaves, motivated by the spread of the Japanese beetle in Milan. Using an ESP32-S3 microcontroller with an OV2640 camera, we collected and processed a dataset of linden tree leaves, expanding it through data augmentation to improve model robustness.
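As an illustration of the augmentation step, a pipeline along these lines can be built with standard Keras preprocessing layers; the specific transformations, their strengths, and the number of augmented copies below are assumptions for the sketch, not the exact settings used on the leaf dataset.

```python
import tensorflow as tf
from tensorflow.keras import layers

# Hypothetical augmentation pipeline: random flips, small rotations,
# zoom and contrast changes to enlarge the leaf dataset.
augmenter = tf.keras.Sequential([
    layers.RandomFlip("horizontal_and_vertical"),
    layers.RandomRotation(0.1),   # factor 0.1 ~= +/- 36 degrees
    layers.RandomZoom(0.1),
    layers.RandomContrast(0.1),
])

def expand_dataset(ds: tf.data.Dataset, copies: int = 3) -> tf.data.Dataset:
    """Concatenate the original dataset with `copies` randomly augmented
    passes over it, roughly multiplying the number of training images."""
    out = ds
    for _ in range(copies):
        out = out.concatenate(
            ds.map(lambda img, label: (augmenter(img, training=True), label),
                   num_parallel_calls=tf.data.AUTOTUNE))
    return out.shuffle(1024).prefetch(tf.data.AUTOTUNE)
```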
A series of experiments was carried out on Edge Impulse, testing several neural network architectures, input resolutions, and hyperparameters. Transfer learning with MobileNetV2 achieved the best accuracy while remaining within the ESP32’s memory constraints. We also analyzed model interpretability through Class Activation Maps (CAM), confirming that the network based its predictions on relevant leaf features.
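A roughly equivalent Keras sketch of the MobileNetV2 transfer learning setup is shown below; the 96×96 input resolution, the width multiplier, and the two-class head are assumptions chosen to illustrate the memory-constrained configuration, not the exact Edge Impulse settings. Because the head is global average pooling followed by a dense layer, a CAM can be read directly from the dense weights, as in the small helper.

```python
import numpy as np
import tensorflow as tf

NUM_CLASSES = 2            # e.g. healthy vs. diseased leaves (assumption)
INPUT_SHAPE = (96, 96, 3)  # small input resolution to fit in RAM (assumption)

# Frozen MobileNetV2 feature extractor (ImageNet weights, reduced width)
# with a global-average-pooling classification head.
# Input images are assumed already preprocessed to the [-1, 1] range.
base = tf.keras.applications.MobileNetV2(
    input_shape=INPUT_SHAPE, include_top=False,
    weights="imagenet", alpha=0.35)
base.trainable = False

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])


def class_activation_map(image, class_idx):
    """CAM for a GAP + Dense head: weight the last convolutional feature
    maps by the dense-layer weights of the requested class."""
    features = base(image[None, ...], training=False)[0].numpy()  # (h, w, c)
    dense = model.layers[-1]
    w = dense.get_weights()[0][:, class_idx]                      # (c,)
    cam = np.maximum(features @ w, 0)                             # (h, w)
    return cam / (cam.max() + 1e-8)
```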
In parallel, we developed a custom model based on SqueezeNet in Google Colab. Training from scratch on the small dataset yielded poor results, but applying transfer learning from the larger Imagenette dataset significantly improved accuracy, reaching 86%. The quantized version of this model was successfully deployed on the ESP32.
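Deploying the Colab model on the ESP32 requires quantization; a minimal sketch of full-integer post-training quantization with the TensorFlow Lite converter is shown below. The helper name and the use of a small set of representative training images for calibration are illustrative assumptions.

```python
import tensorflow as tf

def quantize_for_esp32(model, rep_images):
    """Full-integer post-training quantization; `rep_images` is a small
    sample of training images used to calibrate activation ranges."""
    def representative_data():
        for img in rep_images:
            yield [tf.cast(img[None, ...], tf.float32)]

    converter = tf.lite.TFLiteConverter.from_keras_model(model)
    converter.optimizations = [tf.lite.Optimize.DEFAULT]
    converter.representative_dataset = representative_data
    converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
    converter.inference_input_type = tf.int8
    converter.inference_output_type = tf.int8
    return converter.convert()

# The returned byte string can be written to a .tflite file or converted
# to a C array (e.g. with `xxd -i`) for inclusion in the ESP32 firmware.
```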
Final deployment tests demonstrated that the system can capture leaf images, perform real-time classification, and transmit results to a host computer. The project validated the feasibility of lightweight AI for environmental monitoring on resource-constrained embedded devices.
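On the host side, receiving the classification results can be as simple as listening on the ESP32’s serial link; the sketch below assumes a pyserial-based reader, a hypothetical port name, and a hypothetical "label,confidence" line format, which is not necessarily the protocol used by the firmware.

```python
import serial  # pip install pyserial

PORT = "/dev/ttyUSB0"   # ESP32 serial port (assumption; adjust as needed)
BAUD = 115200

# Read classification results streamed by the ESP32 firmware.
# The "label,confidence" line format is an assumed convention.
with serial.Serial(PORT, BAUD, timeout=5) as link:
    while True:
        line = link.readline().decode("utf-8", errors="ignore").strip()
        if not line:
            continue
        label, confidence = line.split(",")
        print(f"Leaf classified as {label} ({float(confidence):.2f})")
```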