In early January 2025, prior to joining this contest, I was watching the news about the Southern California wildfires. At the time, hundreds of thousands were being evacuated. It was devastating to see people's homes and properties burn down as they stood by, helplessly watching. Firefighters were exhausted, working around the clock, yet the fire kept spreading, consuming every inch it could reach.
Driven by a growing desire to make a difference in the firefighting industry, I began researching how wildland firefighting works. To my surprise, I learned that most firefighters are limited to hand tools because of the long hikes and rugged terrain; heavy equipment like bulldozers and chainsaws has limited use in these conditions. This greatly slows fire line construction, and reliance on aerial support can cost thousands to millions of dollars. It's estimated that operating a 747 Supertanker could cost up to $250,000 per day!
Feeling a sense of urgency, I began thinking about designing an autonomous ground rover that could assist firefighters directly and enhance their situational awareness. Ultimately, while it's a difficult and complex challenge, developing a system capable of autonomously constructing fire lines could significantly reduce the risk to firefighters’ lives.
1. Research and Development (R&D)
In 2024, the United States faced more than 60,000 wildfires that burned over eight million acres. NASA's data shows the scorched area has been creeping upward year after year. With crews stretched thin, I started asking myself a simple question: could a small, portable robot ease their workload and keep them safer?
As I dug into wildland-fire tactics, one method kept popping up—building a fireline. Firefighters strip away vegetation so the flames run out of fuel and stop spreading. It works, but it demands either heavy machinery or exhausting, back-breaking hand tools.
That sparked an idea. What if an autonomous robot could roll up to the fire’s edge, spot the hotspots, and carve a mini fireline—or even knock down small flames—on its own?
While reading up on how farmers cut into packed soil, I realized that the same digging tricks might help a robot scratch a narrow fire-break—and, with a few nozzles, even spray water or fire-retardant along the way.
The Guardian is my autonomous ground rover designed to back up wild-land crews. Even in this early prototype it can:
- drive itself to GPS waypoints,
- spot a flame or hotspot with onboard vision, and
- tackle fires in two ways—by carving a thin fireline and by dousing small flare-ups with a built-in pump.
At the heart of the rover is a Particle Boron 404X. Thanks to its LTE-M cellular modem, I can check in on the robot from anywhere the cell network reaches. The Particle Cloud dashboard lets me watch telemetry, send commands, and push over-the-air firmware updates in one place.
Better yet, the Boron’s GPIO pins and on-chip microcontroller give me direct control of motors, sensors, and pumps—no extra electronics stack required.
Enough talk—let’s dig into the build.
I originally planned to 3D-print an entire tracked chassis from scratch. Then a friend offered me a spare set of tracks he no longer needed, so I grabbed it and saved weeks of work. With the drivetrain sorted, all I had to design was a sturdy 3D-printed deck where the electronics and onboard computer could bolt down neatly. (If you're interested, here are some 3D-printable options I found earlier: 3D Printable Robot Tank Chasis, GrabCAD's 3D Printable Tank Chasis.)
I could then use PCBWay or another 3D-printing service to print the platform that houses my sensors and electronics.
With the chassis ready, I jumped to the fun part—making everything move with the Particle Boron 404X. I first grabbed an old Arduino to quickly test each motor, servo, and the water pump; a few blinks and spins later, I knew the wiring was solid. From there, porting the sketches to the Boron was almost a copy-and-paste job—the coding style is so close to the Arduino IDE's that the transition took minutes.
Here’s the big-picture flow:
- Jetson Orin — the “brain.” It crunches data from the camera, GPS, and IMU, figures out the next move, and passes those orders downstream.
- Particle Boron 404X — the “hands and feet.” It takes the Orin’s commands and actually drives the motors, steers the servos, and kicks on the water pump.
- Cellular link — the back channel. While the Boron is busy running hardware, it also keeps an ear on the Particle Cloud over LTE, so I can nudge the robot—or update its firmware—from anywhere with cell service.
In short: Orin thinks, Boron acts, and the cloud keeps us both in touch.
Here’s the quick rundown—and a sneak peek at the wiring diagram:
Two 11.1V LiPo batteries power the motor controller, which drives the tracked wheels. The system also provides 5V to power the 4-channel relay, water pump, and servo, since the Boron only supplies 3.3V. The Boron sends TTL serial signals from its TX pin to the motor controller to regulate wheel speed and direction, and it drives the relay channels that operate the water pump and servo with digital pins (HIGH or LOW).
2.1 Setup Cloud Solution with the Particle Console
The Particle Console is the web platform provided for monitoring your Particle devices.
When logged in, you'll first set up and link your product (in my case, the Boron 404X), which gives you a product ID to link in Visual Studio Code. Here's a snippet of the interface when logged in: you'll see your devices' "health" and connectivity. It even provides the geolocation for you.
You can also perform over-the-air code updates, as well as trigger dedicated functions. Here I created two functions, Command Movement and Send Waypoint; when called, they send the command logic to the Boron on the Guardian to perform tasks.
Aside from that, several other metrics are already built into the console, such as your cellular data usage (100 MB free per month!), battery life, and more. It also comes with webhook integration, and I'm super excited to use this for other projects in the future.
With the existing Particle Workbench extension in Visual Studio Code, we can quickly set up our devices and begin developing within this page:
Here’s a general order to follow:
Once you've logged in and set up your devices, link your local project's device name to the same one on the console. You can then create the project, which provides a main.cpp with the required libraries, identical to an Arduino platform. (More setup instructions: Manual Setup, Getting Started)
Code Summary:
- Cloud hooks. Two Particle functions—manualMove() for quick joystick-style commands and sendWaypoint() for latitude/longitude—let you poke the robot from the Particle Console or the Jetson's higher-level planner.
- Waypoint hand-off. When a new waypoint arrives, the Boron squirts a tiny JSON packet (^{ … }$) out over its USB serial port to the Jetson. That keeps the high-power computer in charge of global navigation.
- Motor control. The Jetson then replies with single-character drive codes (f, b, l, r, s), which the Boron's autoMove() translates into one-byte speed commands on Serial1—the Sabertooth motor driver's simplified-serial interface—so no PWM timing is needed on the micro.
- Dig & spray. A pair of GPIO lines flips a relay for the digging motor and another for the water pump; helper functions (shouldDig(), shouldWater()) toggle them and enforce 30-second safety timeouts.
- Heartbeat failsafe. If the Boron doesn't hear a drive character for half a second, or a dig/spray refresh within 30 seconds, it drops everything to a safe stop.
/*
* Project The Guardian
* Author: Nathan H.
* Date:
* For comprehensive documentation and examples, please visit:
* https://docs.particle.io/firmware/best-practices/firmware-template/
*/
// Include Particle Device OS APIs
#include "Particle.h"
#define motorSerial Serial1 // tells compiler to replace all motorSerial as Serial1
SYSTEM_MODE(AUTOMATIC);
SYSTEM_THREAD(ENABLED);
SerialLogHandler logHandler(LOG_LEVEL_INFO);
const int relayPin = D6;
const int waterPin = D5;
unsigned long lastCommandTime = 0;
const unsigned long commandTimeout = 500;
unsigned long lastCommandDigTime = 0;
const unsigned long commandDigTimeout = 30000;
bool isDigging = false;
unsigned long lastCommandWatTime = 0;
const unsigned long commandWatTimeout = 30000;
bool isWatering = false;
float cmd_lat = 0.00;
float cmd_lon = 0.00;
bool newWaypointReceived = false;
int manualMove(String command);
int sendWaypoint(String command);
int setLED(String command);
int autoMove(String command);
void stopMotors();
void shouldDig();
void diggingOn();
void diggingOff();
void shouldWater();
void waterOn();
void waterOff();
void setup() {
Serial.begin(9600);
motorSerial.begin(9600);
Particle.function("Command Movement", manualMove);
Particle.function("Send Waypoint", sendWaypoint);
stopMotors();
    pinMode(relayPin, OUTPUT);
    digitalWrite(relayPin, HIGH); // relay channels are active-LOW, so HIGH = dig motor off
    pinMode(waterPin, OUTPUT);
    digitalWrite(waterPin, HIGH); // HIGH = water pump off
}
void loop() {
    // Frame the waypoint as ^{"lat":...,"lon":...}$ so the Jetson can parse it out of the stream.
    if (newWaypointReceived) {
Serial.print("^");
Serial.print("{\"lat\":");
Serial.print(cmd_lat, 6);
Serial.print(",\"lon\":");
Serial.print(cmd_lon, 6);
Serial.print("}");
Serial.println("$");
newWaypointReceived = false;
}
if (Serial.available()) {
char cmd = Serial.read();
lastCommandTime = millis();
autoMove(String(cmd));
}
if (millis() - lastCommandTime > commandTimeout) {
stopMotors();
}
if (millis() - lastCommandDigTime > commandDigTimeout) {
diggingOff();
}
if (millis() - lastCommandWatTime > commandWatTimeout) {
waterOff();
}
}
int sendWaypoint(String command){
int commandIdx = command.indexOf(",");
if (commandIdx == -1) {
return -1;
}
String latStr = command.substring(0, commandIdx);
String lonStr = command.substring(commandIdx + 1);
cmd_lat = latStr.toFloat();
cmd_lon = lonStr.toFloat();
newWaypointReceived = true;
return 1;
}
int setLED(String command) {
if (command == "on") {
digitalWrite(D7, HIGH);
return 1;
} else if (command == "off") {
digitalWrite(D7, LOW);
return 0;
}
return -1;
}
int manualMove(String command) {
unsigned long commandStart = millis();
unsigned long elapsedTime = 0;
char cmd = command.charAt(0);
if (cmd == 'd'){
shouldDig();
}
if (cmd == 'w'){
shouldWater();
}
    // Repeat the drive bytes for ~2 s; the 500 ms failsafe in loop() then stops the motors.
    while (elapsedTime < 2000) {
        elapsedTime = millis() - commandStart;
        switch (cmd) {
        // Sabertooth simplified serial: bytes 1-127 drive motor 1, bytes 128-255 drive motor 2
        case 'f':
            motorSerial.write(127); // motor 1 full forward
            motorSerial.write(255); // motor 2 full forward
            break;
case 'b':
motorSerial.write(1);
motorSerial.write(128);
break;
case 'l':
motorSerial.write(1);
motorSerial.write(255);
break;
case 'r':
motorSerial.write(127);
motorSerial.write(128);
break;
case 's':
stopMotors();
break;
default:
break;
}
}
return 0;
}
int autoMove(String command) {
char cmd = command.charAt(0);
switch (cmd) {
case 'f':
motorSerial.write(127);
motorSerial.write(255);
break;
case 'b':
motorSerial.write(1);
motorSerial.write(128);
break;
case 'l':
motorSerial.write(1);
motorSerial.write(255);
break;
case 'r':
motorSerial.write(127);
motorSerial.write(128);
break;
case 's':
stopMotors();
break;
case 'd':
Serial.println("should dig triggered");
shouldDig();
break;
case 'w':
shouldWater();
break;
default:
break;
}
return 0;
}
void stopMotors() {
    motorSerial.write(64);  // motor 1 neutral (simplified-serial stop)
    motorSerial.write(192); // motor 2 neutral
}
void shouldDig(){
if(!isDigging){
lastCommandDigTime = millis();
diggingOn();
Serial.println("on");
} else {
diggingOff();
Serial.println("off");
}
}
void diggingOn(){
digitalWrite(relayPin, LOW);
isDigging = true;
}
void diggingOff(){
digitalWrite(relayPin, HIGH);
isDigging = false;
}
void shouldWater(){
if(!isWatering){
lastCommandWatTime = millis();
waterOn();
Serial.println("water on");
} else {
waterOff();
Serial.println("water off");
}
}
void waterOn(){
digitalWrite(waterPin, LOW);
isWatering = true;
}
void waterOff(){
digitalWrite(waterPin, HIGH);
isWatering = false;
}
Up to this point, you should already be able to command the robot via the Particle Console!
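If you'd rather script those commands than click through the console, the same cloud functions are exposed over the Particle Cloud REST API. Here's a minimal Python sketch; the device ID and access token are placeholders you'd grab from the console, and the function names match the ones registered in the firmware above.

```python
# call_guardian.py -- poke the Guardian's cloud functions via the Particle REST API.
import requests

DEVICE_ID = "YOUR_DEVICE_ID"        # placeholder: from the Particle Console
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"  # placeholder: a Particle API access token

def call_function(name: str, arg: str) -> int:
    """POST /v1/devices/<id>/<function> and return the firmware's int result."""
    resp = requests.post(
        f"https://api.particle.io/v1/devices/{DEVICE_ID}/{name}",
        data={"arg": arg},
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["return_value"]

print(call_function("Command Movement", "f"))                  # drive forward for ~2 s
print(call_function("Send Waypoint", "34.052235,-118.243683")) # returns 1 if parsed, -1 if malformed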
Now that we've covered the Boron, our second-in-command, let's introduce our first-in-command: the Jetson Orin Nano. This main brain controls the entire robot, from sensing to navigation. As of this year, the Orin Nano (Super) Developer Kit has been upgraded to double the performance at half its original price. https://www.nvidia.com/en-us/autonomous-machines/embedded-systems/jetson-orin/nano-super-developer-kit/
There are three nodes running on the Jetson Orin (a launch-file sketch for bringing them up together follows the list):
- Camera (Oak-D Pro W), which sends the RGB image via ROS 2 middleware for fire detection
- Joystick Controller, which receives joystick inputs for manually driving the Guardian and controlling the water pump and tilting servos
- Guardian Navigation Node, which receives sensor data from the GNSS/IMU and computes navigation corrections against the targeted waypoints
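For reference, here's a minimal ROS 2 launch-file sketch that brings the three nodes up alongside the standard joy driver. The executable names are assumptions on my part; match them to the entry points in your package's setup.py (the guardian_control package name comes from the camera node below).

```python
# guardian_launch.py -- sketch of launching the whole Jetson-side stack at once.
from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    return LaunchDescription([
        Node(package='joy', executable='joy_node'),  # gamepad driver publishing /joy
        Node(package='guardian_control', executable='joy_to_serial_node'),  # assumed name
        Node(package='guardian_control', executable='guardian_nav_node'),   # assumed name
        Node(package='guardian_control', executable='camera_detect_node'),  # assumed name
    ])
```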
joy_to_serial_node.py
Full Code:
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Joy
import serial # from pyserial
from geometry_msgs.msg import Point
from std_msgs.msg import Float32, Bool
import json
class JoyPrinter(Node):
def __init__(self):
super().__init__('joy_printer')
try:
self.ser = serial.Serial('/dev/ttyACM1', 9600, timeout=1)
self.get_logger().info("Serial connection established.")
except serial.SerialException as e:
self.get_logger().error(f"Failed to open serial port: {e}")
self.ser = None
self.heartbeat_timer = self.create_timer(1.0, self.send_heartbeat)
self.get_logger().info("Heartbeat Started")
self.timer = self.create_timer(0.2, self.send_command)
self.latest_joy_msg = None
self.subscription = self.create_subscription(Joy, 'joy', self.joy_callback, 10)
self.get_logger().info("Listening to /joy...")
self.heading_error_sub = self.create_subscription(Float32, 'guardian/heading_error', self.heading_error_callback, 10)
self.distance_sub = self.create_subscription(Float32, 'guardian/distance', self.distance_callback, 10)
self.isFire_sub = self.create_subscription(Bool, 'guardian/isFire', self.fire_detect_callback, 10)
self.target_waypoint = Point()
self.target_waypoint.x = 0.0
self.target_waypoint.y = 0.0
self.target_waypoint.z = 0.0
self.heading_error = -999
self.distance = -999
self.target_waypoint_pub = self.create_publisher(Point, '/guardian/target_gps', 10)
self.autonomous_mode = False
self.serial_buffer = ""
self.read_timer = self.create_timer(1, self.read_serial)
self.isFire = False
    def send_heartbeat(self):
        if self.ser and self.ser.is_open:
            try:
                self.ser.write(b'H')  # any byte refreshes the Boron's 500 ms command failsafe
                self.get_logger().debug("Sent heartbeat: H")
            except serial.SerialException as e:
                self.get_logger().error(f"Heartbeat failed: {e}")
                self.ser.close()
                self.ser = None
def send_command(self):
if self.latest_joy_msg is None:
self.get_logger().warn("No joystick input received yet.")
return
command = 's'
if self.autonomous_mode:
DISTANCE_TRESH = 3
DEG_TRESH = 10
HEADING_OFFSET = -22
            if self.distance != -999 and self.heading_error != -999:
if self.distance > DISTANCE_TRESH:
if self.heading_error > DEG_TRESH:
command = 'l'
elif self.heading_error < -DEG_TRESH:
command = 'r'
else:
command = 'f'
self.get_logger().info(f"Sent command: {command}, heading error {self.heading_error:0.2f}, distance {self.distance:0.2f}")
else:
command = 's'
if self.isFire:
command = 'w'
self.isFire = False
else:
            # axes[6]/axes[7] hold the directional input on this controller; adjust for your gamepad
            left_stick_x = self.latest_joy_msg.axes[6]
            left_stick_y = self.latest_joy_msg.axes[7]
if left_stick_y == 1:
command = 'f'
elif left_stick_y == -1:
command = 'b'
elif left_stick_x == 1:
command = 'l'
elif left_stick_x == -1:
command = 'r'
else:
command = 's'
LB_button = self.latest_joy_msg.buttons[4]
RB_button = self.latest_joy_msg.buttons[5]
if LB_button == 1:
command = 'd'
if RB_button == 1:
command = 'w'
if self.ser and self.ser.is_open:
self.ser.write((command + '\n').encode('utf-8'))
else:
self.get_logger().warn("Serial not open.")
def joy_callback(self, msg):
self.latest_joy_msg = msg
button_a = msg.buttons[0]
self.autonomous_mode = button_a == 1
def heading_error_callback(self, msg):
self.heading_error = msg.data
def distance_callback(self, msg):
self.distance = msg.data
def fire_detect_callback(self, msg):
self.isFire = msg.data
if self.isFire:
self.get_logger().info("fire is detected!!")
def read_serial(self):
if self.ser and self.ser.in_waiting > 0:
try:
incoming = self.ser.read(self.ser.in_waiting).decode('utf-8', errors='ignore')
self.serial_buffer += incoming
while '^' in self.serial_buffer and '$' in self.serial_buffer:
start = self.serial_buffer.find('^') + 1
end = self.serial_buffer.find('$')
if end == -1:
break
json_str = self.serial_buffer[start:end]
self.serial_buffer = self.serial_buffer[end + 1:]
try:
data = json.loads(json_str)
lat = data.get('lat')
lon = data.get('lon')
self.get_logger().info(f"Received Waypoint from Boron: {lat}, {lon}")
self.target_waypoint.x = lat
self.target_waypoint.y = lon
self.target_waypoint_pub.publish(self.target_waypoint)
except json.JSONDecodeError as e:
self.get_logger().warn(f"Failed to parse JSON {e}")
except serial.SerialException as e:
self.get_logger().error(f"Serial read error {e}")
def main(args=None):
    rclpy.init(args=args)
    node = JoyPrinter()
    rclpy.spin(node)
    node.destroy_node()
    rclpy.shutdown()

if __name__ == '__main__':
    main()
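A quick way to exercise this node without a gamepad is to publish a synthetic Joy message yourself. Here's a small sketch; the axis and button indices match what the node above reads, but your controller mapping may differ.

```python
# fake_joy.py -- publish a synthetic /joy message so joy_to_serial_node can be
# tested without a physical gamepad.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Joy

class FakeJoy(Node):
    def __init__(self):
        super().__init__('fake_joy')
        self.pub = self.create_publisher(Joy, 'joy', 10)
        self.timer = self.create_timer(0.1, self.tick)  # 10 Hz, faster than the 0.2 s reader

    def tick(self):
        msg = Joy()
        msg.axes = [0.0] * 8    # indices 6/7 are the directional axes the node reads
        msg.buttons = [0] * 6   # index 0 = autonomous toggle, 4 = dig, 5 = water
        msg.axes[7] = 1.0       # "forward" -> the node emits drive code 'f'
        self.pub.publish(msg)

def main():
    rclpy.init()
    rclpy.spin(FakeJoy())

if __name__ == '__main__':
    main()
```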
guardian_nav_node.py
import rclpy
from rclpy.node import Node
from rclpy.qos import QoSProfile, ReliabilityPolicy, HistoryPolicy
from sensor_msgs.msg import NavSatFix, Imu
from std_msgs.msg import Float32
from geometry_msgs.msg import Point
from tf_transformations import euler_from_quaternion
import math
def normalize_angle_180(deg):
angle = (deg + 180) % 360 - 180
return angle
def normalize_angle_360(deg):
return (deg + 360) % 360
def calculate_bearing(curr_lat, curr_lon, target_lat, target_lon):
curr_lat = math.radians(curr_lat)
curr_lon = math.radians(curr_lon)
target_lat = math.radians(target_lat)
target_lon = math.radians(target_lon)
delta_lon = target_lon - curr_lon
x = math.cos(target_lat) * math.sin(delta_lon)
y = math.cos(curr_lat) * math.sin(target_lat) - math.sin(curr_lat) * math.cos(target_lat) * math.cos(delta_lon)
bearing_rad = math.atan2(x, y)
bearing_deg = math.degrees(bearing_rad)
bearing_deg_norm = normalize_angle_180(bearing_deg)
return bearing_deg_norm
def calculate_distance(curr_lat, curr_lon, target_lat, target_lon):
R = 6371000
curr_lat_rad = math.radians(curr_lat)
curr_lon_rad = math.radians(curr_lon)
target_lat_rad = math.radians(target_lat)
lon2_rad = math.radians(target_lon)
delta_lat = target_lat_rad - curr_lat_rad
delta_lon = lon2_rad - curr_lon_rad
a = math.sin(delta_lat / 2) ** 2 + \
math.cos(curr_lat_rad) * math.cos(target_lat_rad) * math.sin(delta_lon / 2) ** 2
c = 2 * math.atan2(math.sqrt(a), math.sqrt(1 - a))
distance = R * c
return distance
def imu_to_compass_heading(yaw_deg):
return normalize_angle_360(-yaw_deg)
def compute_heading_error(target_deg, curr_deg):
return (curr_deg - target_deg + 540) % 360 - 180
class GuardianNavNode(Node):
def __init__(self):
super().__init__('guardian_nav_node')
self.target_lat = -1
self.target_lon = -1
self.curr_lat = -1
self.curr_lon = -1
self.yaw_deg = float(-999)
self.heading_error = float(-999)
self.distance = float(-999)
qos_profile = QoSProfile(
reliability=ReliabilityPolicy.BEST_EFFORT,
history=HistoryPolicy.KEEP_LAST,
depth=1)
self.gps_sub = self.create_subscription(NavSatFix, '/shield/gps', self.gps_callback, qos_profile)
self.imu_sub = self.create_subscription(Imu, '/shield/imu', self.imu_callback, qos_profile)
self.target_gps_sub = self.create_subscription(Point, '/guardian/target_gps', self.target_gps_callback, 10)
self.yaw_pub = self.create_publisher(Float32, 'guardian/yaw_deg', 10)
self.heading_error_pub = self.create_publisher(Float32, 'guardian/heading_error', 10)
self.distance_pub = self.create_publisher(Float32, 'guardian/distance', 10)
self.target_waypoint_test = Point()
self.test_target_waypoint_pub = self.create_publisher(Point, '/guardian/target_gps', 10)
def gps_callback(self, msg):
self.curr_lat = msg.latitude
self.curr_lon = msg.longitude
def imu_callback(self, msg):
orientation_q = msg.orientation
quaternion = (orientation_q.x, orientation_q.y, orientation_q.z, orientation_q.w)
roll, pitch, yaw = euler_from_quaternion(quaternion)
yaw_degrees = math.degrees(yaw)
self.yaw_deg = imu_to_compass_heading(yaw_degrees)
        if self.target_lat != -1 and self.target_lon != -1 and self.curr_lat != -1 and self.curr_lon != -1:
target_bearing = calculate_bearing(self.curr_lat, self.curr_lon, self.target_lat, self.target_lon)
self.distance = calculate_distance(self.curr_lat, self.curr_lon, self.target_lat, self.target_lon)
if self.yaw_deg != -999:
self.heading_error = compute_heading_error(target_bearing, self.yaw_deg)
self.get_logger().info(f"Yaw:{self.yaw_deg:.2f}, Azim:{target_bearing:.2f}, Err:{self.heading_error:.2f}")
self.heading_error_pub.publish(Float32(data=self.heading_error))
self.distance_pub.publish(Float32(data=self.distance))
def target_gps_callback(self, msg):
self.target_lat = msg.x
self.target_lon = msg.y
def main(args=None):
rclpy.init(args=args)
node = GuardianNavNode()
rclpy.spin(node)
node.destroy_node()
rclpy.shutdown()
if __name__ == "__main__":
main()
camera_detect_node.py
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image
from std_msgs.msg import Bool
from cv_bridge import CvBridge
import cv2
import numpy
import torch
from ultralytics import YOLO
from ament_index_python.packages import get_package_share_directory
import os
class FireDetectorNode(Node):
def __init__(self):
super().__init__('fire_detector_node')
self.subscription = self.create_subscription(
Image,
'/oak/rgb/image_raw',
self.image_callback,
10)
self.bridge = CvBridge()
model_path = os.path.join(
get_package_share_directory('guardian_control'),
'models',
'fire_epoch_50_batch_6.engine'
)
self.model = YOLO(model_path)
self.last_log = self.get_clock().now()
self.get_logger().info("Fire Detector Node Started")
self.isFire = Bool()
self.isFire.data = False
self.isFire_pub = self.create_publisher(Bool, '/guardian/isFire', 10)
def image_callback(self, msg):
try:
frame = self.bridge.imgmsg_to_cv2(msg, desired_encoding='bgr8')
flipped = cv2.flip(frame, 0)
results = self.model(flipped, verbose=False)
for r in results:
if r.boxes is not None:
for box in r.boxes:
xyxy = box.xyxy[0].cpu().numpy().astype(int)
conf = box.conf[0].item()
if conf >= 0.50:
cls_id = int(box.cls[0].item())
label = self.model.names[cls_id] if hasattr(self.model, 'names') else str(cls_id)
cv2.rectangle(flipped, (xyxy[0], xyxy[1]), (xyxy[2], xyxy[3]), (0, 255, 0), 2)
cv2.putText(flipped, f"{label} {conf:.2f}", (xyxy[0], xyxy[1] - 10),
cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 255, 0), 2)
self.isFire.data = True
self.isFire_pub.publish(self.isFire)
self.get_logger().info(f"Confidence: {conf:.2f}")
                        else:
                            self.isFire.data = False  # low-confidence detection: treat as no fire
cv2.imshow("Fire Detection", flipped)
cv2.waitKey(1)
except Exception as e:
self.get_logger().error(f"Failed to process image: {e}")
def main(args=None):
rclpy.init(args=args)
node = FireDetectorNode()
rclpy.spin(node)
node.destroy_node()
rclpy.shutdown()
if __name__ == '__main__':
main()
3. Making It Autonomous!
Originally, I intended to use the Arduino Giga board, since it already has available GPS and IMU libraries for development (SparkFun_u-blox_GNSS_Arduino_Library). However, since I wasn't able to obtain one, I used my own ESP32 and tinkered with it.
#include <Arduino.h>
#include <Wire.h>
#include <Adafruit_BusIO_Register.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_BNO08x.h>
#include <SparkFun_u-blox_GNSS_Arduino_Library.h>
#include <micro_ros_platformio.h>
#include <rcl/rcl.h>
#include <rclc/rclc.h>
#include <rclc/executor.h>
#include <std_msgs/msg/int32.h>
#include <sensor_msgs/msg/imu.h>
#include <sensor_msgs/msg/nav_sat_status.h>
#include <sensor_msgs/msg/nav_sat_fix.h>
#include "freertos/FreeRTOS.h"
#include "freertos/task.h"
#include "esp_system.h"
#if !defined(MICRO_ROS_TRANSPORT_ARDUINO_SERIAL)
#error This example is only available for Arduino framework with serial transport.
#endif
#define BNO08X_CS A0
#define BNO08X_INT A1
#define BNO08X_RESET A2
Adafruit_BNO08x bno08x(BNO08X_RESET);
sh2_SensorValue_t sensorValue;
#define I2C_CLK_SPEED 400000
#define configSUPPORT_STATIC_ALLOCATION 1
SFE_UBLOX_GNSS myGNSS;
int8_t STATUS_NO_FIX = -1;
int8_t STATUS_FIX = 0;
int8_t STATUS_SBAS_FIX = 1;
int8_t STATUS_GBAS_FIX = 2;
int8_t status;
uint16_t SERVICE_GPS = 1;
uint16_t SERVICE_GLONASS = 2;
uint16_t SERVICE_COMPASS = 4;
uint16_t SERVICE_GALILEO = 8;
uint16_t service;
#define RCCHECK(fn) { rcl_ret_t temp_rc = fn; if((temp_rc != RCL_RET_OK)){return false;}}
#define RCSOFTCHECK(fn) { rcl_ret_t temp_rc = fn; if((temp_rc != RCL_RET_OK)){}}
#define EXECUTE_EVERY_N_MS(MS, X) do { \
static volatile int64_t init = -1; \
if (init == -1) { init = uxr_millis();} \
if (uxr_millis() - init > MS) { X; init = uxr_millis();} \
} while (0)
sensor_msgs__msg__Imu imu_msg;
sensor_msgs__msg__NavSatFix gps_msg;
rcl_publisher_t publisher;
std_msgs__msg__Int32 msg;
rcl_publisher_t imu_pub;
rcl_publisher_t gps_pub;
rclc_executor_t executor;
rclc_support_t support;
rcl_allocator_t allocator;
rcl_node_t node;
rcl_timer_t timer;
const int timeout_ms = 1000;
uint32_t time_ns;
int32_t time_seconds;
enum states {
WAITING_AGENT,
AGENT_AVAILABLE,
AGENT_CONNECTED,
AGENT_DISCONNECTED
} state;
TaskHandle_t Task1;
TaskHandle_t Task2;
static const int RATE_GPS = 1024;
void setReports(void) {
Serial.println("Setting desired reports");
if (!bno08x.enableReport(SH2_ACCELEROMETER)) {
Serial.println("Could not enable accelerometer");
}
if (!bno08x.enableReport(SH2_GYROSCOPE_CALIBRATED)) {
Serial.println("Could not enable gyroscope");
}
if (!bno08x.enableReport(SH2_ARVR_STABILIZED_RV)) {
Serial.println("Could not enable stabilized remote vector");
}
if (!bno08x.enableReport(SH2_GRAVITY)) {
Serial.println("Could not enable gravity");
}
}
void error_loop() {
while(1) {
delay(100);
}
}
void timer_callback(rcl_timer_t * timer, int64_t last_call_time) {
(void) last_call_time;
(void) timer;
RCLC_UNUSED(last_call_time);
RCSOFTCHECK(rmw_uros_sync_session(1000));
    if (rmw_uros_epoch_synchronized()) {
        time_seconds = rmw_uros_epoch_millis()/1000 + 37; // +37 s: TAI-UTC leap-second offset
time_ns = rmw_uros_epoch_nanos();
if (timer != NULL) {
RCSOFTCHECK(rcl_publish(&imu_pub, &imu_msg, NULL));
RCSOFTCHECK(rcl_publish(&gps_pub, &gps_msg, NULL));
}
        printf("UNIX time: %ld s\n", (long) time_seconds);
}
}
bool create_entities() {
allocator = rcl_get_default_allocator();
RCCHECK(rclc_support_init(&support, 0, NULL, &allocator));
RCCHECK(rclc_node_init_default(&node, "micro_ros_platformio_node", "", &support));
RCCHECK(rclc_publisher_init_best_effort(&imu_pub, &node, ROSIDL_GET_MSG_TYPE_SUPPORT(sensor_msgs, msg, Imu), "shield/imu"));
RCCHECK(rclc_publisher_init_best_effort(&gps_pub, &node, ROSIDL_GET_MSG_TYPE_SUPPORT(sensor_msgs, msg, NavSatFix), "shield/gps"));
    const unsigned int timer_timeout = 100; // publish period in ms
RCCHECK(rclc_timer_init_default(&timer, &support, RCL_MS_TO_NS(timer_timeout), timer_callback));
RCCHECK(rclc_executor_init(&executor, &support.context, 2, &allocator));
RCCHECK(rclc_executor_add_timer(&executor, &timer));
return true;
}
void destroy_entities() {
rmw_context_t * rmw_context = rcl_context_get_rmw_context(&support.context);
(void) rmw_uros_set_context_entity_destroy_session_timeout(rmw_context, 0);
    rcl_publisher_fini(&imu_pub, &node);
    rcl_publisher_fini(&gps_pub, &node);
rcl_timer_fini(&timer);
rclc_executor_fini(&executor);
rcl_node_fini(&node);
rclc_support_fini(&support);
}
void setup() {
Wire.begin();
Serial.begin(115200);
set_microros_serial_transports(Serial);
delay(200);
xTaskCreatePinnedToCore(update_GPS_data, "update__data", 8912, NULL, 1, &Task1, 1);
delay(100);
myGNSS.setI2COutput(COM_TYPE_UBX);
myGNSS.saveConfigSelective(VAL_CFG_SUBSEC_IOPORT);
myGNSS.setNavigationFrequency(1);
myGNSS.setAutoPVT(true);
myGNSS.saveConfiguration();
if (!bno08x.begin_SPI(BNO08X_CS, BNO08X_INT)) {
// Serial.println("Failed to find BNO08x chip");
}
setReports();
state = WAITING_AGENT;
}
void update_GPS_data(void * pvParameters) {
if (myGNSS.begin() == false) {
Serial.println("Failed to find GPS module");
}
    const TickType_t xDelay = 500 / portTICK_PERIOD_MS;
    for(;;) {
        gps_msg.latitude = myGNSS.getLatitude()/10000000.;   // u-blox reports degrees * 1e7
        gps_msg.longitude = myGNSS.getLongitude()/10000000.;
        gps_msg.altitude = myGNSS.getAltitude()/1000.;       // mm -> m
        gps_msg.status.status = (myGNSS.getSIV() == 0) ? STATUS_NO_FIX : STATUS_SBAS_FIX;
        gps_msg.header.stamp.sec = time_seconds;
        gps_msg.header.stamp.nanosec = time_ns;
        gps_msg.header.frame_id.data = "shield_gps_frame";
        gps_msg.position_covariance[0] = myGNSS.getHorizontalAccuracy();
        gps_msg.position_covariance[4] = myGNSS.getHorizontalAccuracy();
        gps_msg.position_covariance[8] = myGNSS.getVerticalAccuracy();
        vTaskDelay(xDelay); // poll the GNSS at ~2 Hz and yield so other tasks can run
    }
}
void loop() {
if (!bno08x.getSensorEvent(&sensorValue)) {}
switch(sensorValue.sensorId) {
case SH2_GYROSCOPE_CALIBRATED:
imu_msg.angular_velocity.x = sensorValue.un.gyroscope.x;
imu_msg.angular_velocity.y = sensorValue.un.gyroscope.y;
imu_msg.angular_velocity.z = sensorValue.un.gyroscope.z;
break;
case SH2_ARVR_STABILIZED_RV:
imu_msg.orientation.x = sensorValue.un.arvrStabilizedRV.i;
imu_msg.orientation.y = sensorValue.un.arvrStabilizedRV.j;
imu_msg.orientation.z = sensorValue.un.arvrStabilizedRV.k;
imu_msg.orientation.w = sensorValue.un.arvrStabilizedRV.real;
break;
case SH2_ACCELEROMETER:
imu_msg.linear_acceleration.x = sensorValue.un.accelerometer.x;
imu_msg.linear_acceleration.y = sensorValue.un.accelerometer.y;
imu_msg.linear_acceleration.z = sensorValue.un.accelerometer.z;
break;
}
imu_msg.header.stamp.sec = time_seconds;
imu_msg.header.stamp.nanosec = time_ns;
imu_msg.header.frame_id.data = "shield_imu_frame";
imu_msg.angular_velocity_covariance[0] = .5;
imu_msg.angular_velocity_covariance[4] = .5;
imu_msg.angular_velocity_covariance[8] = .5;
imu_msg.orientation_covariance[0] = .5;
imu_msg.orientation_covariance[4] = .5;
imu_msg.orientation_covariance[8] = .5;
imu_msg.linear_acceleration_covariance[0] = 2;
imu_msg.linear_acceleration_covariance[4] = 2;
imu_msg.linear_acceleration_covariance[8] = 2;
switch (state) {
case WAITING_AGENT:
EXECUTE_EVERY_N_MS(500, state = (RMW_RET_OK == rmw_uros_ping_agent(100, 1)) ? AGENT_AVAILABLE : WAITING_AGENT;);
break;
case AGENT_AVAILABLE:
state = (true == create_entities()) ? AGENT_CONNECTED : WAITING_AGENT;
if (state == WAITING_AGENT) {
destroy_entities();
};
break;
case AGENT_CONNECTED:
EXECUTE_EVERY_N_MS(50, state = (RMW_RET_OK == rmw_uros_ping_agent(100, 1)) ? AGENT_CONNECTED : AGENT_DISCONNECTED;);
if (state == AGENT_CONNECTED) {
rclc_executor_spin_some(&executor, RCL_MS_TO_NS(100));
}
break;
case AGENT_DISCONNECTED:
destroy_entities();
state = WAITING_AGENT;
break;
default:
break;
}
}
In short, the ESP32 streams live GNSS and IMU data to our main brain (the Jetson Orin Nano) as the ROS 2 topics /shield/gps and /shield/imu, with a micro-ROS agent on the Jetson bridging the serial link; we use that data to tell the robot how to get to the assigned waypoints.
3.1 Waypoint Mission
For the robot to understand how to get from its current position to the next waypoint, it needs to know where it currently is (its latitude and longitude) and its orientation on Earth (azimuth).
(azimuth is a more specific, absolute direction, while bearing can be relative to a point or line)
With these measurements from the GPS/IMU sensors, we simply draw an imaginary line from the Guardian's current position to the waypoint. Comparing this line to the Guardian's current heading tells us how much to turn to face the waypoint; then we command it to move forward until we reach the waypoint's position (when the Guardian's latitude and longitude match the waypoint's).
You can reference the exact code in guardian_nav_node.py, posted earlier.
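To make the geometry concrete, here's a quick numeric sanity check using the same formulas, with arbitrary example coordinates: a rover at (34.05, -118.25) facing due east, with a waypoint about 1.1 km due north.

```python
# bearing_check.py -- sanity-check the waypoint math from guardian_nav_node.py.
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    # great-circle initial bearing, same formula as calculate_bearing()
    lat1, lon1, lat2, lon2 = map(math.radians, (lat1, lon1, lat2, lon2))
    dlon = lon2 - lon1
    x = math.cos(lat2) * math.sin(dlon)
    y = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y))

def heading_error(target_deg, curr_deg):
    # signed error in (-180, 180], same as compute_heading_error()
    return (curr_deg - target_deg + 540) % 360 - 180

bearing = bearing_deg(34.05, -118.25, 34.06, -118.25)
print(f"bearing to waypoint: {bearing:.1f} deg")                 # ~0.0 -> due north
print(f"heading error: {heading_error(bearing, 90.0):.1f} deg")
# +90.0 -> well past the 10-degree threshold, so the joystick node commands 'l' (turn left)
```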
I originally thought I could use the Nordic nRF9151-DK and run Edge Impulse for visual classification (detecting fire on the edge device), but upon further research it seems Edge Impulse runs via the cloud, and I couldn't find further information on how to implement it. With some time constraints, I decided to run detection on the Jetson Orin.
The goal of utilizing AI is to identify hazards like smoke and fire. Thankfully the YOLO and RoboFlow community already has thousands of datasets we can train our AI model to recognize and deploy.
YOLO (You Only Look Once) is a family of object detection models known for their speed and accuracy. They are single-shot detectors, meaning they process an entire image at once to identify and locate objects. YOLO uses a convolutional neural network (CNN) to make predictions about object bounding boxes and class probabilities.
First, we need to install the Ultralytics library, which includes YOLO11, by following the guide here:
https://docs.ultralytics.com/guides/nvidia-jetson/
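Before training anything, it's worth a quick check that the install works by running the stock pretrained model on any local image (test_image.jpg below is a placeholder):

```python
# yolo_smoke_test.py -- verify the Ultralytics install with the pretrained YOLO11 nano model.
from ultralytics import YOLO

model = YOLO("yolo11n.pt")         # pretrained weights, downloaded on first use
results = model("test_image.jpg")  # placeholder: any local image
for r in results:
    for box in r.boxes:
        print(model.names[int(box.cls[0])], float(box.conf[0]))
```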
Roboflow is a platform for visual AI tools; it streamlines model deployment and enhances performance for specific use cases. We will be utilizing an available dataset labeled by the community.
We will then create an account on Roboflow and download the dataset here:
https://universe.roboflow.com/sayed-gamall/fire-smoke-detection-yolov11/dataset/2
This dataset is composed of images with fire and smoke, labeled for training purposes.
Then, we will go to the dataset and select download dataset.
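If you'd rather script the download, Roboflow also provides a Python package (pip install roboflow). A sketch, with the workspace, project, and version slugs taken from the dataset's data.yaml and a placeholder API key:

```python
# download_dataset.py -- pull the fire/smoke dataset programmatically.
from roboflow import Roboflow

rf = Roboflow(api_key="YOUR_API_KEY")  # placeholder: your Roboflow API key
project = rf.workspace("sayed-gamal-kn2wu").project("fire-smoke-detection-yolov11")
dataset = project.version(2).download("yolov11")  # downloads and unpacks locally
print(dataset.location)  # the folder containing data.yaml
```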
Unzip the dataset; you should see data.yaml along with the test, train, and valid folders inside.
Edit data.yaml so its first three lines hard-code the absolute paths (you can find the absolute path by running pwd in a terminal inside that folder), to something like:
train: /home/<username>/Downloads/dataset_fire_smoke_yolo11/train/images
val: /home/<username>/Downloads/dataset_fire_smoke_yolo11/valid/images
test: /home/<username>/Downloads/dataset_fire_smoke_yolo11/test/images
nc: 2
names: ['Fire', 'Smoke']
roboflow:
workspace: sayed-gamal-kn2wu
project: fire-smoke-detection-yolov11
version: 2
license: CC BY 4.0
url: https://universe.roboflow.com/sayed-gamal-kn2wu/fire-smoke-detection-yolov11/dataset/2
Then we’ll begin training via YoLo 11
# Go back to the root directory of the dataset folder
yolo detect train data=dataset_fire_smoke_yolo11/data.yaml epochs=5 batch=4 imgsz=640  # batch=4 to run on the Jetson; needs fine-tuning
For my case, I found some balance between speed and accuracy with these values:
yolo detect train data=dataset_fire_smoke_yolo11/data.yaml model=yolo11n.pt epochs=5 batch=4 imgsz=640
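One detail worth noting: the camera node earlier loads fire_epoch_50_batch_6.engine, a TensorRT engine, rather than the raw .pt weights. Here's a sketch of that export step; run it on the Jetson itself, since TensorRT engines are built for a specific GPU (the weights path is Ultralytics' default output, so adjust it to your run):

```python
# export_engine.py -- convert the trained weights into a TensorRT engine for the Jetson.
from ultralytics import YOLO

model = YOLO("runs/detect/train/weights/best.pt")  # default Ultralytics output path
model.export(format="engine")                      # writes best.engine next to the weights
```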
4.1 Migrating to the nRF9151-DK
Nordic's nRF9151-DK is a super upgrade compared to the Boron 404x. It already includes an on-board GNSS, as well as a plethora of GPIO and core access, making it robust and powerful for various applications. However, it was quite challenging to set up.
Here are my development notes on the nRF9151-DK if you'd like to follow along:
https://www.notion.so/nRF9151-DK-Dev-Note-1f537fe51a828091b5dddc9a508978ef?pvs=4
4.2 Particle Boron 404x vs Nordic nRF9151-DK
When comparing the two boards, the Boron wins on ease of setup and already-available Arduino-style development. The Particle Console was great for monitoring the device from anywhere, so I'd always know where the Guardian is!
On the other hand, the nRF9151-DK was challenging to develop on, with far fewer off-the-shelf functions, but it is definitely very powerful if given the time.
5. Future Work:
Unfortunately, the two months of development have flown by, and I wish I had more time to complete it and make it super robust.
Here are a few things I'd like to add:
- Sensor fusion
- Web interface
- Better motors
- Add Bluetooth Capability
- Robot Arm!!
I would love to add a Robot Arm so that the Guardian could use tools in the future!
If you enjoyed this project, I would love to get your support to continue building this!
You can buy me some coffee here ☕️