Like many ambitious projects, Aegis Control started not with a blueprint, but with a question: How could I build a truly intelligent system that meaningfully integrates ground-based robotics with an autonomous aerial component?
I've always been fascinated by the intersection of the physical and digital worlds. My workshop is filled with drones, sensors, and microcontrollers—each a powerful tool on its own. The ambition for this project was to make them work together, to create a system where the whole was greater than the sum of its parts, fueled by the exciting possibilities of modern AI.
The first step was giving the idea a physical form. The turret and drone couldn't just be functional; they had to look and feel the part. Drawing inspiration from my favorite sci-fi and gaming aesthetics, I began shaping the digital clay. Dozens of hours were spent in 3D modeling, designing every component from the ground up to be rugged, printable on my Bambu Lab machine, and visually striking. This wasn't just about building a case; it was about creating a robotic personality.
With a body, the system needed senses. This was the deep dive into AI and computer vision. I chose the YOLOv8 model for its power and efficiency, and began the challenging process of training it to recognize drones. Seeing the bounding box appear correctly on the video feed for the first time was the magic moment—the proof that this core concept was viable. While the initial response was slow, the path forward was clear: the AI brain could see, and now the journey of optimization could begin.
Of course, none of it would work without a robust nervous system. I designed a custom, multi-voltage power distribution system using dedicated buck converters and MOSFET controls—an engineering choice to ensure the power-hungry servos wouldn't interfere with the sensitive Raspberry Pi. The entire system is controlled with a wireless remote relay switch.
Finally, it all needed to be tied together with software. I developed a Flask-based web server to provide a clean, intuitive control interface for manual operation, and configured the drone with iNav to unlock its autonomous GPS capabilities.
Aegis Control, as it stands today, is the culmination of that journey. It's a V1 platform with a proven V2 soul. It's a testament to what's possible in a home workshop when you combine 3D printing, custom electronics, and the power of artificial intelligence. This project pushed my skills in every domain, from mechanical design to machine learning.
My hope is that it will inspire you to do the same. This guide will walk you through how to build your own Aegis Control system. The complete STL file package is available for purchase on my Printables page. Thank you for checking it out. Let's get building.
Bill of Materials (BOM)
Please see the dedicated "Things" section of this project for the complete Bill of Materials covering all required components. Most of the materials were purchased on Amazon; you can copy and paste an item name into Amazon's search to find exactly what you need.
Build Instructions
Step 1: 3D Printing the Parts
A successful build starts with high-quality prints. It's recommended to print parts like the legs and yoke arms oriented for maximum strength along their stress axes.
To get started, purchase and download the complete, print-ready STL file package from my official Printables page:
- Recommended Material: PETG or ASA for their strength and thermal stability. PLA works as well if that's all you have.
- Recommended Infill: 25-40% using a strong pattern like Gyroid or Cubic.
- Walls/Perimeters: 3-4 for robust turret parts; 2 walls for drone parts.
- Supports: Tree/Organic supports are recommended for easier removal and a cleaner finish on overhangs. Use a brim for any parts with a small footprint on the build plate to ensure adhesion.
Patience and precision here will result in a smooth, reliable turret. You will need a standard M3/M4 screw assortment.
Step 2: Turret Assembly
Base Assembly: Hub
- Secure the six legs to the main hexagonal hub using M3 screws.
- Install the main pan servo into its housing within the hub, ensuring it is centered. Attach the 6mm D-Shaft Servo Hub 25T and 6mm D-Shaft (Stainless Steel, 20mm Length) to the servo.
- Optional: install threaded inserts into the holes where the hub top will secure to the hub body. I simply used a soldering iron to press the inserts into place.
Yoke & Tilt Mechanism Assembly: This is the most critical mechanical step.
- Press the tilt servo into one of the yoke arms. Secure its mounting tabs with screws. Attach the servo horn 25T to the output spline.
- On the opposite yoke arm, insert a 5mm diameter steel rod through the pivot hole. This will act as the non-driven axle.
- Attach the turret head between the two yoke arms, securing it to the tilt servo horn on one side and allowing the rod to pass through the other.
- Slide a set screw collar onto the end of the 5mm rod on the outside of the yoke arm and tighten it, leaving no slop. This creates a strong, precise, and smooth pivot point.
Final Assembly: Attach the Rigid Flange Shaft Coupling 6mm to the bottom of the yoke assembly. Mount the completed yoke assembly onto the pan servo in the main hub. Ensure your wiring has a clear path and will not bind during rotation.
Step 3: Electronics & Wiring
Plan your component layout inside the main hub before permanently mounting anything. Please refer to the schematic PDFs.
Tune Buck Converters: Before connecting to your Pi or servos, power each buck converter and use a multimeter to adjust the potentiometer. Tune one to exactly 5.1V (for the Pi) and the other to 8V (for the servos).
Power Distribution:
- Connect the 20V battery to the "Power Wheels Adapter", then run the adapter's output to your wireless power switch.
- Wire the switch output to the inputs of both tuned buck converters.
- Route the 8V output through the High Power MOSFET module. This module's signal pin will be controlled by a Pi GPIO, acting as a software power switch for the servos.
- Connect the 5.1V output to the XT60 to Type-C Fast Charging Cable, then to the Raspberry Pi 4B's USB-C power input.
Logic Connections:
- Connect the Pi’s I2C Bus 0 pins (GPIO 0/SDA0 and GPIO 1/SCL0) and 5V/GND to the PCA9685 servo driver's SDA, SCL, VCC, and GND pins respectively.
- Connect the Pan and Tilt servo signal wires to channels 0 and 1 on the PCA9685.
- Connect the 8V power from your MOSFET switch to the PCA9685's servo power rail (V+ terminal).
- Connect a wire from Raspberry Pi GPIO 17 to the Signal (SIG) pin on the MOSFET switch module; a quick smoke test for these connections appears below.
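The following sketch assumes the adafruit-circuitpython-pca9685 and adafruit-circuitpython-extended-bus Python packages (installed during Step 4) and the pin/channel assignments above; run it once the software setup is complete to confirm the Pi can drive the PCA9685 and gate the servo rail. Adapt it if your wiring differs.

# Minimal smoke test for the servo power MOSFET (GPIO 17) and the
# PCA9685 on I2C bus 0. Centers both servos, then cuts servo power.
import time
import RPi.GPIO as GPIO
from adafruit_extended_bus import ExtendedI2C
from adafruit_pca9685 import PCA9685

GPIO.setmode(GPIO.BCM)
GPIO.setup(17, GPIO.OUT)
GPIO.output(17, GPIO.HIGH)            # enable the 8V servo rail via the MOSFET

i2c = ExtendedI2C(0)                  # bus 0 = GPIO 0 (SDA0) / GPIO 1 (SCL0)
pca = PCA9685(i2c)
pca.frequency = 50                    # standard 50 Hz servo refresh rate

def pulse(us):
    # Convert a pulse width in microseconds to a 16-bit duty cycle (20 ms frame).
    return int(us / 20_000 * 0xFFFF)

pca.channels[0].duty_cycle = pulse(1500)   # pan servo to center
pca.channels[1].duty_cycle = pulse(1500)   # tilt servo to center
time.sleep(2)

GPIO.output(17, GPIO.LOW)             # cut servo power
pca.deinit()
GPIO.cleanup()

If both servos snap to center and then release when the rail drops, the power and logic paths are wired correctly.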
Wire Management: Cut wires to length, use heat shrink tubing on all solder joints, and use small zip ties to bundle cables neatly inside the hub.
Step 4: Raspberry Pi & Software Setup
Flash OS: Use the Raspberry Pi Imager to flash the latest headless Raspberry Pi OS (64-bit) to a microSD card.
Initial Config: Boot the Pi and run sudo raspi-config.
- Under Interface Options, enable SSH, I2C, and Legacy Camera Support. You may need to specifically enable I2C0 in /boot/config.txt if it's not enabled by default, as this is the bus used by the code.
- Set your WiFi country, locale, and timezone.
Install Dependencies: Please see the requirements.txt document.
Special Case: TensorFlow Lite Runtime
- The TFLite Runtime library cannot be installed directly with a simple pip install command. You must install a specific version that matches your Raspberry Pi's architecture and Python version.
- You will need to download the correct .whl file from Google's repository and install it manually. For example, for a 64-bit Raspberry Pi OS with Python 3.9, the command would look something like this:
- pip install https://github.com/google-coral/pycoral/releases/download/v2.0.0/tflite_runtime-2.5.0.post1-cp39-cp39-linux_aarch64.whl
- You will need to find the correct link for your specific setup; the snippet below prints the values to match.
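A quick check of your system's tags (standard library only, nothing project-specific):

# Print the two values that determine the correct tflite_runtime wheel:
# the CPU architecture (e.g. aarch64) and the CPython tag (e.g. cp39).
import platform
import sys

print(platform.machine())                                      # aarch64 -> linux_aarch64 wheels
print(f"cp{sys.version_info.major}{sys.version_info.minor}")   # Python 3.9 -> cp39-cp39 wheels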
System-Level Dependencies
Before the Python libraries will install and run correctly, you often need to install underlying system libraries using apt.
sudo apt-get update && sudo apt-get install -y \
libatlas-base-dev \
libhdf5-dev \
libhdf5-serial-dev \
libjasper-dev \
libqt4-test \
libqtgui4 \
libffi-dev
Notes on System Dependencies:
- Picamera2: The picamera2 library is now often included with recent versions of Raspberry Pi OS, but if not, it can have its own system dependencies.
- NetworkManager: For the python-networkmanager library to function, your Raspberry Pi must be configured to use NetworkManager as its primary network service, which is not always the default.
- Legacy packages: On Raspberry Pi OS Bullseye and later, libjasper-dev, libqt4-test, and libqtgui4 are no longer in the default repositories; if apt cannot find them, omit them from the command above.
Network Configuration: This advanced step allows for field use. Follow the guide to install hostapd and dnsmasq, and use the network_manager.sh script to handle automatic switching between connecting to a known WiFi network and creating its own Access Point.
- The provided Python code contains experimental routes (/setup, /scan-wifi, etc.) for configuring WiFi via a web page. These routes require sudo privileges and are not recommended for a final, secure build.
- These instructions detail the recommended and more secure method: using hostapd and dnsmasq with a system-level script that automatically switches between a known WiFi network and Access Point mode. It is advised to follow this guide and remove the experimental networking routes from the Python script.
System Service: Create and enable the turret.service file in /etc/systemd/system/. The file is also located in the attachments section of this project. Be sure to replace all variables marked <<< >>> with values for your setup. This service will run the network script first, then launch the main Python controller automatically on every boot.
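For reference, a minimal unit file might look like the sketch below. The paths, user, and script names here are placeholders only; the attached turret.service is the authoritative version.

[Unit]
Description=Aegis Control turret controller
After=network.target

[Service]
Type=simple
User=pi
# Placeholder paths -- substitute your <<< >>> values here
WorkingDirectory=/home/pi/aegis
ExecStartPre=/home/pi/aegis/network_manager.sh
ExecStart=/usr/bin/python3 /home/pi/aegis/turret_controller.py
Restart=on-failure

[Install]
WantedBy=multi-user.target

After copying it into place, run sudo systemctl daemon-reload and sudo systemctl enable turret.service so it starts on every boot.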
To get the best performance for video streaming and AI, and to enable the correct hardware interfaces, we need to edit the Pi's main configuration file. Open it by running sudo nano /boot/firmware/config.txt and add the following lines to the bottom:
# Boost performance for AI and video processing
# NOTE: Requires a good heatsink and fan!
over_voltage=6
arm_freq=2000
# Allocate more memory to the GPU for the camera
gpu_mem=256
# Explicitly enable the I2C bus used by the PCA9685 in the code
# This is crucial as the code specifies BUS 0
dtparam=i2c_arm=on
dtoverlay=i2c0
Step 5: Drone Assembly & iNav Configuration
Assembly & Wiring: Assemble the 3D printed drone frame. Carefully wire the motors, GPS (to a UART), ESP32 (to another UART), and your radio receiver (to a third UART) to the SpeedyBee F7 flight controller.
Flashing: Install the iNav Configurator on your PC. Download the SPEEDYBEEF7MINI iNav firmware and flash it to the flight controller in DFU mode.
iNav Configuration:
- Setup Tab: Calibrate the accelerometer on a level surface.
- Motors Tab (PROPS OFF): Verify motor directions and order.
- Ports Tab: Enable "Serial RX" for your receiver, "GPS", and "MSP" for the DroneBridge ESP32 module on their respective UARTs.
- Failsafe Tab: Set the failsafe procedure to RTH (Return To Home). This is a critical safety step.
- Modes Tab: Set up switches on your radio to ARM the drone and to manually trigger flight modes.
Setting up Arduino IDE for ESP32
- Install Arduino IDE: Download and install the latest version of the Arduino IDE (2.x is recommended) from the official Arduino Software page.
- Add ESP32 Board Manager URL:
  - Open the Arduino IDE.
  - Go to File > Preferences (or Arduino IDE > Settings... on macOS).
  - Find the text box labeled "Additional Board Manager URLs".
  - Paste the following URL into the box: https://raw.githubusercontent.com/espressif/arduino-esp32/gh-pages/package_esp32_index.json
  - Click "OK".
- Install the ESP32 Boards Package:
  - Go to Tools > Board > Boards Manager....
  - In the search bar, type esp32.
  - Find the entry named "esp32 by Espressif Systems" and click the "Install" button. This will download and install the necessary files.
- Select Your ESP32 Board & Port:
  - Plug your ESP32 board into your computer via USB.
  - Go to Tools > Board > esp32 and select the specific board you are using (e.g., "ESP32C5 Dev Module" or a generic equivalent like "ESP32 Dev Module").
  - Go to Tools > Port and select the COM port (on Windows) or /dev/tty.usbserial port (on Mac/Linux) that your ESP32 is connected to. (You can unplug and replug the board to see which port appears.)
- Upload a Test Sketch:
  - Go to File > Examples > 01.Basics > Blink. A new window with the Blink sketch will open.
  - Click the "Upload" button (the right-arrow icon) in the top left.
  - The IDE will compile and upload the code. If successful, the onboard LED on your ESP32 should begin to blink.
You are now ready to copy, paste, and upload the ESP32 listener code that's in the attachments section of this project.
Instructions for setting up communication from the Raspberry Pi to the ESP32 are in the attachments section of this project.
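As a rough illustration of what that Pi-to-drone link involves, the sketch below frames a minimal MSP v1 request in Python and sends it over TCP. The host, port, and transport are assumptions here (DroneBridge ESP32 firmware commonly exposes a TCP bridge at 192.168.2.1:5760); follow the attached instructions for the actual addresses and framing this project uses.

# Hypothetical example: request MSP_RAW_GPS (command 106) from the flight
# controller through an ESP32 serial bridge. Host and port are assumptions.
import socket

def msp_request(cmd: int, payload: bytes = b"") -> bytes:
    # MSP v1 frame: $M< + size + cmd + payload + XOR checksum
    checksum = len(payload) ^ cmd
    for b in payload:
        checksum ^= b
    return b"$M<" + bytes([len(payload), cmd]) + payload + bytes([checksum])

with socket.create_connection(("192.168.2.1", 5760), timeout=5) as sock:
    sock.sendall(msp_request(106))      # MSP_RAW_GPS
    print(sock.recv(64).hex())          # raw reply; parse per the MSP spec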
Step 6: AI Model Training (The V2 Brain)
Data Collection: A great model needs great data. Collect hundreds of images and videos of your target drone in varied lighting, angles, and backgrounds. There are also many collections of drone images available on Roboflow.
Annotation: Use a platform like Roboflow to upload your dataset and draw bounding boxes around the drones in every image.
Training: Using a GPU-powered PC with the ultralytics YOLOv8 library, export your dataset from Roboflow and begin training.
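If you haven't used the ultralytics API before, a basic training run is only a few lines; the data.yaml path below is a placeholder for wherever your Roboflow export lands:

# Minimal YOLOv8 training run. epochs and imgsz are starting points, not gospel.
from ultralytics import YOLO

model = YOLO("yolov8n.pt")                    # small pretrained base model
model.train(
    data="datasets/drones/data.yaml",         # placeholder: path to Roboflow export
    epochs=100,
    imgsz=640,
)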
Deployment: Once you have a trained model (best.pt), export it to the TensorFlow Lite format using INT8 quantization (see the export sketch after this list). To work directly with the provided code, you must name the final files exactly as follows:
- Your model file must be named: best_int8.tflite
- Your labels file must be named: labels.txt
- Copy both files to your project directory on the Raspberry Pi.
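The export can also be driven from the same API. This is a sketch assuming your dataset yaml doubles as the INT8 calibration set (quantization needs representative images); ultralytics typically writes the result as best_int8.tflite inside a *_saved_model folder, matching the filename required above.

# Export trained weights to TFLite with INT8 quantization.
from ultralytics import YOLO

model = YOLO("runs/detect/train/weights/best.pt")   # default ultralytics output path
model.export(format="tflite", int8=True, data="datasets/drones/data.yaml")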
Step 7: Testing & Tuning
- Subsystem Checks: Power on the turret. Before running the full application, test each subsystem. Does the web interface load? Can you stream video? Do the servos respond to direct commands?
- Range of Motion: Manually test the pan and tilt servos to ensure they have a full, bind-free range of motion. Adjust software limits as needed to prevent stalling.
- Performance Tuning: The Python script contains several parameters that you can adjust to fine-tune the turret's auto-tracking performance (a simplified sketch of how they interact follows this list):
  - TRACKING_P_GAIN: Start with the default of 0.08. If the turret is too slow to react, increase this value slightly. If it overshoots the target, decrease it.
  - SLOWDOWN_DISTANCE: The default of 120 controls the motion easing. Adjust this value to change how smoothly the turret stops.
  - CONFIDENCE_THRESHOLD: The default of 0.4 means the AI will only track objects it is 40% confident about. Lower this to track more tentative detections, or raise it to only track very certain ones.
- Drone Pre-flight: For the drone, perform all pre-flight checks (props off!), ensure a solid GPS lock with 10+ satellites, and conduct a safe, low-altitude hover test before attempting any autonomous modes.
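To make those three parameters concrete, here is a simplified, hypothetical version of the pan-axis tracking update; everything beyond the three named parameters is a placeholder of my own, not the project's actual code:

# Simplified proportional tracking step. error_px is the horizontal distance
# (in pixels) between the detected drone's center and the frame center.
TRACKING_P_GAIN = 0.08        # output per pixel of error
SLOWDOWN_DISTANCE = 120       # pixels; inside this band, motion eases out
CONFIDENCE_THRESHOLD = 0.4    # ignore detections below 40% confidence

def pan_step(error_px: float, confidence: float) -> float:
    # Returns the pan adjustment for one frame.
    if confidence < CONFIDENCE_THRESHOLD:
        return 0.0                                   # too tentative to act on
    step = TRACKING_P_GAIN * error_px                # proportional response
    if abs(error_px) < SLOWDOWN_DISTANCE:
        step *= abs(error_px) / SLOWDOWN_DISTANCE    # ease out near the target
    return step

print(pan_step(200, 0.9))   # 16.0: fast slew while far off target
print(pan_step(60, 0.9))    # 2.4: eased approach near center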