Our bird feeder is continuously raided by squirrels, chipmunks, and other decidedly non-bird animals, to the point where "bird feeder" is almost a misnomer. So I wanted to create a bird-only feeder, leveraging the powerful object detection capabilities of the Arduino UNO Q.
Getting Started

The UNO Q is incredibly flexible to work with: it can be connected directly to your computer like a typical Arduino dev board in PC Hosted Mode, accessed remotely in Network Mode, or even become a fully-fledged computer in Single-Board Computer Mode. After initial setup, I connected a keyboard, mouse, and monitor via a USB-C hub and started exploring the UNO Q's development environment, App Lab, directly on the device. The Detect Objects on Camera example sounded germane to my concept, so I connected a USB webcam and tried it out.
The Detect Objects on Camera example uses App Lab's video_objectdetection Brick, which adds drag-and-drop object detection to your project. I was impressed by the speed and accuracy of the pre-trained model, which uses Edge Impulse's FOMO (Faster Objects, More Objects) machine learning algorithm under the hood. I was originally expecting to have to train my own model, but using yolox-object-detection with the video_objectdetection Brick reliably detected birds, saving me a ton of effort! I did find that the object detection algorithm pushed the board near its limits, making it hard to interact with the GUI while it was running in SBC mode, but reducing the screen resolution to 1280×720 seemed to help.
While the UNO Q can function as a Single-Board Computer (SBC), it also has an STM32 Microcontroller (MCU), which is more similar to the brains found in typical Arduino dev boards. This facilitates interaction with sensors and actuators, and the Bridge Library allows the SBC and MCU to communicate, for example moving a servo via the MCU-connected pins in response to something the Linux portion detected with the webcam.
A simple Python script starts the detection, and executes a callback function when an object is detected:
# Imports as used in the App Lab Brick examples
from arduino.app_utils import *  # provides Bridge, among other utilities
from arduino.app_bricks.video_objectdetection import VideoObjectDetection

detection_stream = VideoObjectDetection(confidence=0.5, debounce_sec=5.0)

def bird_detected():
    Bridge.call("bird")
    print("Bird detected!")

def anything_detected(detections: dict):
    print("All detections:", detections)

detection_stream.on_detect("bird", bird_detected)
detection_stream.on_detect_all(anything_detected)

The bird_detected() function is called when a bird is detected, while anything_detected() is called when anything at all is detected, which facilitates debugging and gives a better idea of what the camera sees. Both use print() to make it clear that they have been triggered, but bird_detected() also calls the bird() function in the Arduino sketch portion of the project via the Bridge.
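When watching the stream of detections scroll by, it can help to condense the output. Here is a minimal sketch of such a helper, assuming the Brick reports detections as a label-to-confidence mapping (the exact dict shape may differ, and `summarize()` is a hypothetical function of my own, not part of the Brick):

```python
def summarize(detections: dict, threshold: float = 0.5) -> list:
    """Return labels at or above the confidence threshold, best first.

    Assumes `detections` maps label -> confidence (0.0 to 1.0); adjust
    if your Brick reports a different structure.
    """
    hits = [(label, conf) for label, conf in detections.items() if conf >= threshold]
    hits.sort(key=lambda pair: pair[1], reverse=True)
    return [f"{label} ({conf:.0%})" for label, conf in hits]

print(summarize({"bird": 0.91, "squirrel": 0.34, "chipmunk": 0.62}))
# → ['bird (91%)', 'chipmunk (62%)']
```

Dropping a call like this into anything_detected() makes it much easier to spot which labels the model is confusing at a glance.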
In addition to outputting "bird!" to the serial monitor to confirm that the function was executed, the bird() function sweeps a servo connected to digital pin D9 five times, pushing seed out through a flap in the reservoir for the detected bird to eat.
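The sketch side might look something like this. The Servo calls are standard Arduino, but treat the Bridge registration call and the sweep angles/timing as illustrative assumptions rather than my project's exact code:

```cpp
// Illustrative sketch-side code (assumption: the Bridge library exposes a
// provide()-style call for registering functions the Python side can invoke).
#include <Servo.h>

Servo feeder;  // servo that pushes seed through the reservoir flap

void bird() {
  Serial.println("bird!");        // confirm the function ran
  for (int i = 0; i < 5; i++) {   // sweep 5 times to dispense seed
    feeder.write(180);
    delay(500);
    feeder.write(0);
    delay(500);
  }
}

void setup() {
  Serial.begin(115200);
  feeder.attach(9);               // servo signal on D9
  Bridge.provide("bird", bird);   // expose bird() to the Python side
}

void loop() {}
```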
Now that my app was functioning as intended, I set it to Run at startup in App Lab and disconnected the keyboard, mouse, and monitor.
It was difficult to convince any birds or chipmunks to pose in front of the camera while I was outside, so I printed some photos and held them up to the camera. They worked well, but with the servo buried inside the bird feeder, the audible whine that I'd used as a signal that things were working became muffled. At first I used my beloved Openterface Mini-KVM to monitor the project, but since I eventually wanted it to run without any computer equipment connected, I decided to leverage the UNO Q's 8×13 LED matrix to help convey app state. Using the LED Matrix Painter example, I drew crude bird and X pixel art, and added the resulting hex arrays to my sketch:
const uint32_t bird_frame[] = {0x0e008808, 0x2084c401, 0xa0308602, 0x40000000};
const uint32_t nope_frame[] = {0x20408402, 0x400c0060, 0x04804204, 0x08000000};

I then used the loadFrame() function to display the bird graphic when one was detected, and the X otherwise. This made it far easier to tell from a distance whether detections were accurate.
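In the sketch, switching between the two frames is just a matter of picking which array to load. A hedged sketch of the idea; `matrix` and `birdSeen` are stand-ins for however your code names the LED matrix object and detection state:

```cpp
// Hypothetical usage: choose which frame to display based on app state.
if (birdSeen) {
  matrix.loadFrame(bird_frame);  // bird detected: show the bird graphic
} else {
  matrix.loadFrame(nope_frame);  // nothing detected: show the X
}
```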
Note: the awesome web-based LED Matrix Editor, designed for the UNO R4's 8×12 LED matrix, is not compatible with the UNO Q.
Results and Conclusions

Using yolox-object-detection with the video_objectdetection Brick provided accurate, out-of-the-box bird recognition, saving a ton of time vs. training a custom model. The dual-brain SBC/MCU architecture of the UNO Q made it easy to actuate a servo in response to detections on the Linux side. If you'd like to try it yourself, connect a USB webcam and powered USB-C hub to an UNO Q, a servo to 5V/GND/D9, and import my GitHub repo zip into App Lab. I was blown away by how quickly and easily I was able to implement my idea!
Despite the success I had with the premade model, I would still like to train my own model using local fauna. Perhaps instead of excluding chipmunks and squirrels, I could give them their own separate food when detected, using a second servo and seed reservoir. Or take it in the opposite direction and implement some kind of deterrent! I'd love to replace the cardboard reservoir wall/servo mount with laser-cut acrylic, and improve the seed-scooping mechanism beyond the simple servo arm used here. It's also considered a best practice to power the servo from an external supply rather than the board itself. Weatherproofing, and perhaps a small light for when nighttime visitors are detected, would help extend the device's utility further. Let me know in the comments if you build this project or have ideas to improve it!