Illegal logging and poaching often happen far from power and cellular coverage. Estimates put illegal logging at 15-30% of the global timber trade. Rangers may be responsible for patrolling hundreds of square kilometers with limited comms, while a single chainsaw can drop a mature tree in minutes; by the time a report is relayed, the actors are often gone.
This build shows a practical approach to the problem: a compact, solar-powered acoustic node that listens locally, decides locally with a small Edge Impulse model, and only transmits a short alert plus a still image when something important occurs.
To run unattended, the design emphasizes low idle draw and self-sufficiency: a small 1 W solar panel and an 18650 cell harvest enough energy for continuous operation.
Because field devices are at risk of tampering or theft, the node also includes simple tamper sensing (SW‑420 vibration switch) to flag disturbance events alongside acoustic detections. The emphasis is on parts that are easy to source, straightforward to assemble, and simple to service in the field.
Overview
This project builds a solar-powered acoustic watcher that detects forest threats using an Edge Impulse model running on a Seeed XIAO ESP32S3 Sense. The device listens for chainsaws, vehicles, and gunshots, debounces detections, snaps a still image, logs to microSD with timestamped metadata, and publishes the event via MQTT to Adafruit IO. It also flags tamper events via a simple vibration switch. Power is provided by a small solar charger and an 18650 cell feeding a 3.3 V LDO. The prototype uses Wi-Fi for simplicity; a production version would migrate to LoRaWAN for range and lower idle power.
All design files (KiCad project, PCB Gerbers, STLs, Arduino firmware, Edge Impulse export ZIP) are attached in the Attachments section of this Hackster project page.
Difficulty: Intermediate
Build time: 5-7 hours
Skills: Soldering, Arduino/ESP32, basic 3D printing
Cost: ~$23.37 per node in parts, excluding tools
Features
The node runs a four-class Edge Impulse model on a Seeed XIAO ESP32S3 Sense to listen for chainsaws, vehicles, and gunshots, with a simple SW-420 vibration switch for tamper indication. On a confirmed detection it captures an 800 × 600 JPEG, stores it to microSD with a timestamped filename, and publishes a compact JSON message to Adafruit IO over Wi-Fi with TLS. Power comes from a CN3791 solar charger feeding an 18650 cell and a 3.3 V LDO, and the whole unit mounts to a tree in a printed enclosure.
This approach favors privacy and practicality. Audio data never leave the device; only labeled events do. The hardware is inexpensive, the firmware is small, and it is easy to replicate for pilots or classroom builds.
Bill of materials (BOM)
Target unit cost (order quantity ≥ 5): ~$23.37
Schematic and wiring
- Solar panel → CN3791 SOLAR±
- 18650 battery → CN3791 BAT±
- CN3791 SYS OUT± → PCB J_SYS_IN → VBAT_IN/GND
- VBAT_IN → AP2112K‑3.3 → 3V3_REGOUT → JP_EXT3V3_EN → 3V3 → XIAO 3V3
- XIAO 5V: not used; VUSB: test pad only
- SW‑420 header: VCC→3V3, DO→D1, GND→GND, local 0.1 µF decoupling
- Power connector: 2‑pin screw terminal (5.08 mm) to J_SYS_IN
- Expansion: Grove I²C connector broken out for future sensors
Pin map:
- SW-420 → D1: Tamper indication
- LED → D2: Status LED
- Buzzer → D3: Event tone
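A minimal GPIO setup matching this pin map might look like the following; D1/D2/D3 are the Arduino pin aliases on the XIAO ESP32S3, and the attached sketch may organize this differently.
// Assumed GPIO setup for the pin map above (placed in setup())
pinMode(D1, INPUT);    // SW-420 digital output (tamper indication)
pinMode(D2, OUTPUT);   // status LED
pinMode(D3, OUTPUT);   // buzzer for event tone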
Make sure the 3V3 shunt jumper is removed when connecting power (or programming) through USB.
Edge AI pipeline using Edge Impulse Studio
Training used the FSC22 dataset (2,025 clips). In Edge Impulse Studio the audio is resampled from 44.1 kHz to 16 kHz and cropped to a 2 s window. This dataset is designed specifically for detecting illegal forest activities.
Four primary classes are selected from this dataset: Chainsaw (illegal logging detection), Vehicle (unauthorized access combining ATV, truck, and motorcycle sounds), Gunshot (poaching detection including explosions), and Forest Ambience (natural background sounds for false positive reduction).
The FSC22 dataset is available under a CC-BY license, enabling commercial use with proper attribution.
FSC22 Dataset: https://www.kaggle.com/datasets/irmiot22/fsc22-dataset
Dataset Attribution:
M. Bandara, R. Jayasundara, I. Ariyarathne, D. Meedeniya, and C. Perera, "FSC22 Dataset," IEEE Dataport, 2022, doi: 10.21227/40ds-0z76.
A Mel-filterbank energy (MFE) block feeds a small CNN that compiles to TensorFlow Lite Micro for the ESP32S3. The exported Arduino library runs a sliding window on device. Detections are confirmed with a simple threshold and N-of-M debounce before any publish or photo (this step is handled in the Arduino code described in the next section).
On this build, the validation accuracy is 77.1% and the test accuracy is 74.1%. Feature extraction takes about 301 ms per window and inference about 7 ms. Peak RAM usage is roughly 8.5 KB and flash about 32.4 KB.
For sensible defaults, the firmware uses a score threshold of 0.85, requires 2 of the last 3 windows to agree, and applies a 5 s cooldown after a confirmed event (all of these values can easily be modified and recalibrated for the deployment).
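As a reference, the tunables might be declared near the top of the sketch like this. SCORE_THRESH, M_WINDOW, and N_HITS match the debounce snippet shown later in this article; COOLDOWN_MS is an assumed name for the cooldown constant.
#define SCORE_THRESH  0.85f   // minimum class score for a window to count as a hit
#define M_WINDOW      3       // size of the sliding result window
#define N_HITS        2       // hits required within M_WINDOW to confirm an event
#define COOLDOWN_MS   5000    // quiet period after a confirmed event, in ms (name assumed)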
Edge Impulse Public project link: https://studio.edgeimpulse.com/public/785094/live
Arduino code outline
The firmware is organized around four clean pieces:
1. I2S audio path from the XIAO ESP32S3 Sense microphone to the model.
2. Small debounce ring with a cooldown.
3. Timestamping used for both payloads and filenames.
4. Per-class feed mapper for Adafruit IO.
Together they keep the main loop simple: capture → classify → confirm → snapshot/log/publish.
1. I2S audio → Edge Impulse
Audio from the Sense PDM mic is read via ESP_I2S using the XIAO pin map and exposed to Edge Impulse through a signal_t accessor. This avoids intermediate buffers and lets the EI SDK pull floats on demand.
static void i2sInit() {
// XIAO ESP32S3 Sense mic pins: BCLK=42, WS=41
I2S.setAllPins(-1, 42, 41, -1, -1); // MCLK, BCLK, WS, DOUT, DIN
I2S.begin(PDM_MONO_MODE, SAMPLE_RATE, 16);
}
static size_t captureAudioWindow() {
size_t needed = RAW_COUNT, got = 0;
while (got < needed) {
int32_t n = I2S.read((char*)(&g_audio[got]), (needed - got) * sizeof(int16_t));
if (n > 0) got += (size_t)n / sizeof(int16_t); else delay(1);
}
return got;
}
static int raw_audio_get_data(size_t offset, size_t length, float *out_ptr) {
if ((offset + length) > RAW_COUNT) return EIDSP_OUT_OF_BOUNDS;
for (size_t i = 0; i < length; i++) out_ptr[i] = (float)g_audio[offset + i] / 32768.0f;
return EIDSP_OK;
}
In loop() the code captures a 2 s window, builds a signal_t with raw_audio_get_data, runs run_classifier, and picks the top target label.
2. Debounce ring with cooldown
A sliding window of the last three results implements 2-of-3 confirmation above a score threshold, followed by a 5 s cooldown window to suppress repeats from the same source.
static bool debounced(const char* label, float score) {
if (score < SCORE_THRESH) return false;
strncpy(hits[hitHead].label, label, sizeof(hits[hitHead].label)-1);
hits[hitHead].label[sizeof(hits[hitHead].label)-1] = 0;
hits[hitHead].score = score;
hitHead = (hitHead + 1) % M_WINDOW;
int cnt = 0;
for (int i = 0; i < M_WINDOW; i++)
if (!strcmp(hits[i].label, label) && hits[i].score >= SCORE_THRESH) cnt++;
return cnt >= N_HITS;
}
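The 5 s cooldown can be layered on top of debounced() with a simple timestamp check. This is one way to do it; COOLDOWN_MS matches the tunables sketch earlier, and the helper name cooledDown() is illustrative.
static uint32_t lastEventMs = 0;

// Returns true if enough time has passed since the last confirmed event.
static bool cooledDown() {
  uint32_t now = millis();
  if (now - lastEventMs < COOLDOWN_MS) return false;  // still suppressing repeats
  lastEventMs = now;
  return true;
}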
Tamper is handled independently: if the SW-420 is active, the event is published immediately as class="tamper", regardless of the classifier.
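In code, the tamper path can be as simple as polling the SW-420 pin each loop iteration before running the classifier; here publishEvent() is a stand-in for the publish/log helper.
// Tamper check, independent of the audio classifier (SW-420 DO wired to D1)
if (digitalRead(D1) == HIGH) {
  publishEvent("tamper", 1.0f, /*tamper=*/true);   // publish immediately, score fixed at 1.0
}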
3. Timestamping
Boot uses NTP to establish UTC, then filenames and JSON use the same clock source. If time is unavailable, filenames fall back to a millis-based prefix to keep logs unique.
static void ntpInit() {
configTime(0, 0, "pool.ntp.org", "time.nist.gov");
for (int i = 0; i < 50; i++) { time_t t = time(nullptr); if (t > 1700000000) { timeValid = true; break; } delay(100); }
}
static String tsIso8601() {
if (timeValid) { time_t nowT = time(nullptr); struct tm t; gmtime_r(&nowT, &t);
char buf[32]; strftime(buf, sizeof(buf), "%Y-%m-%dT%H:%M:%SZ", &t); return String(buf); }
return String("T") + millis();
}
static String captureJpegSVGA() {
  camera_fb_t *fb = esp_camera_fb_get();
  if (!fb) return String("");
  time_t nowT = time(nullptr); struct tm t;
  bool haveTime = (nowT > 1500000000) && gmtime_r(&nowT, &t);
  char name[32];
  if (haveTime) strftime(name, sizeof(name), "%Y%m%d_%H%M%S.jpg", &t);
  else snprintf(name, sizeof(name), "T%lu.jpg", millis());
  if (!SD.exists("/EVT")) SD.mkdir("/EVT");
  String path = String("/EVT/") + name;
  File f = SD.open(path, FILE_WRITE);
  bool wrote = false;
  if (f) { wrote = (f.write(fb->buf, fb->len) == fb->len); f.close(); }
  esp_camera_fb_return(fb);
  return wrote ? path : String("");  // report success from the write itself, not the closed handle
}
Each event also appends a CSV line (ts,node_id,class,score,tamper,vbat,img) to /event_log.csv for offline analysis.
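An append helper matching that schema could look like this; it assumes NODE_ID is the node-id string constant from the sketch config and that the SD card is already mounted.
// Append one event record to /event_log.csv in the order ts,node_id,class,score,tamper,vbat,img.
static void appendEventCsv(const String &ts, const char *cls, float score,
                           bool tamper, float vbat, const String &img) {
  File logf = SD.open("/event_log.csv", FILE_APPEND);
  if (!logf) return;
  logf.printf("%s,%s,%s,%.2f,%d,%.2f,%s\n",
              ts.c_str(), NODE_ID, cls, score, tamper ? 1 : 0, vbat, img.c_str());
  logf.close();
}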
Here is one of many candid "selfie" sample photos, taken accidentally when the node flagged a "tamper" event while I was handling the device.
4. Per-class, per-node Adafruit IO feeds
Feed keys are generated from the active NODE_ID so the same firmware image runs on multiple nodes. Publishing is a single call that routes JSON to the correct feed and mirrors to CSV on SD.
static String feedKeyFor(const char *label) {
if (!strcmp(label, "status")) return String("status-") + NODE_ID;
return String(label) + "-" + NODE_ID; // e.g. chainsaw-watcher-01
}
static AdafruitIO_Feed* feedFor(const char *label) {
if (!strcmp(label, "chainsaw")) return feed_chainsaw;
if (!strcmp(label, "gunshot")) return feed_gunshot;
if (!strcmp(label, "vehicle")) return feed_vehicle;
if (!strcmp(label, "tamper")) return feed_tamper;
return feed_status;
}
In setup() the code connects to Adafruit IO, creates feed pointers using feedKeyFor(), and then runs the main capture/classify/publish loop. The result is maintainable, small, and easy to port to a LoRaWAN transport later.
The firmware uses the Adafruit IO Arduino client (AdafruitIO_WiFi). On boot it calls io.connect() and waits for AIO_CONNECTED, then binds feed pointers for each class and a status heartbeat. TLS is handled by the library to io.adafruit.com:8883.
Feeds are created programmatically from the node id. feedKeyFor(label) returns "<label>-<NODE_ID>". With NODE_ID="watcher-01" the device publishes to chainsaw-watcher-01, gunshot-watcher-01, vehicle-watcher-01, tamper-watcher-01, and status-watcher-01.
Change NODE_ID to watcher-02 to publish to the corresponding *-watcher-02 feeds. Adafruit IO auto-creates missing feeds on first publish.
Events are serialized to JSON and saved with feed->save(payload.c_str()). The payload schema is consistent across feeds:
{
"ts": "2025-09-26T10:23:45Z",
"node_id": "watcher-01",
"class": "chainsaw",
"score": 0.93,
"tamper": 0,
"img": "/EVT/20250926_102345.jpg",
"vbat": 3.92
}
Core calls in the sketch:
// connect and bind feeds
io.connect();
while (io.status() < AIO_CONNECTED) { io.run(); }
feed_chainsaw = io.feed(feedKeyFor("chainsaw").c_str());
feed_gunshot = io.feed(feedKeyFor("gunshot").c_str());
feed_vehicle = io.feed(feedKeyFor("vehicle").c_str());
feed_tamper = io.feed(feedKeyFor("tamper").c_str());
feed_status = io.feed(feedKeyFor("status").c_str());
// build and publish JSON
StaticJsonDocument<320> doc;
doc["ts"] = tsIso8601();
doc["node_id"] = NODE_ID;
doc["class"] = label; // "chainsaw" | "vehicle" | "gunshot" | "tamper" | "status"
doc["score"] = score; // 0..1 (use 1.0 for tamper/status)
doc["tamper"] = tamper ? 1 : 0;
doc["img"] = imgPath; // "" if capture failed
doc["vbat"] = readBattery();
String payload;
serializeJson(doc, payload);
AdafruitIO_Feed* f = feedFor(label);
if (f) f->save(payload.c_str());
Credentials (Wi-Fi SSID/password, Adafruit IO username/key) and NODE_ID are set in the sketch's in-line config.
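For reference, the in-line config block looks roughly like this, using the Adafruit IO library's standard AdafruitIO_WiFi constructor; the values below are placeholders, so substitute your own credentials.
#define WIFI_SSID    "your-ssid"
#define WIFI_PASS    "your-password"
#define IO_USERNAME  "your-adafruit-io-username"
#define IO_KEY       "your-adafruit-io-key"
#define NODE_ID      "watcher-01"   // change per node, e.g. "watcher-02"

AdafruitIO_WiFi io(IO_USERNAME, IO_KEY, WIFI_SSID, WIFI_PASS);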
Enclosure and mounting
The enclosure consists of two main parts: a base and a top cover. The base holds all electronic components and uses M3 threaded heat inserts in the PLA so the boards can be securely screwed down. The top cover protects the electronics and also carries the solar panel.
Each side of the base includes an embedded M8 nut that accepts the hook for the tree strap. This also creates a simple pivot, allowing the enclosure to be tilted to adjust the camera angle, useful when the node is mounted higher on the tree and needs to point downward.
For prototyping, there is an optional mounting approach: instead of using the M8 nut and tree‑strap hook, the base can attach to a bracket that mounts to an aluminum extrusion, as shown below. I use this variant in the demo video for the indoor test; the outdoor tree test uses the tree‑strap hook.
The CN3791 handles solar MPPT and provides the SYS rail, while an AP2112K-3.3 regulates the 3V3 logic supply. A service jumper (JP_EXT3V3_EN) allows the regulated rail to be isolated during maintenance. Continuous listening is feasible by using light sleep between inference windows, which keeps the average current draw down while maintaining responsiveness.
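A minimal way to implement that light sleep between windows, assuming a fixed nap interval; the attached firmware may schedule this differently.
#include "esp_sleep.h"

// Light-sleep for the given number of milliseconds between inference windows.
// RAM is retained, so the audio buffer and debounce state survive the nap;
// Wi-Fi may need to reconnect after longer naps.
static void napBetweenWindows(uint32_t ms) {
  esp_sleep_enable_timer_wakeup((uint64_t)ms * 1000ULL);  // wakeup timer takes microseconds
  esp_light_sleep_start();
}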
Indicative daily budget
A 6 V, 1 W panel typically delivers 160-180 mA at STC. With roughly four peak-sun-hours per day and about 70% system efficiency, that yields approximately 700-900 mAh/day into a 3.7 V cell, which is near break-even with the consumption budget above. Expect around 2.5-3.0 days of autonomy without sun on a 2600 mAh cell and roughly 4-5 days on a 3400 mAh cell.
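The harvest estimate comes from a simple power-in calculation; as a worked example, with all inputs being the assumptions stated in the paragraph above:
// Daily harvest estimate: panel power × peak-sun-hours × system efficiency,
// converted to mAh at the nominal cell voltage.
const float panel_w     = 1.0f;   // 6 V, 1 W panel
const float sun_hours   = 4.0f;   // peak-sun-hours per day
const float efficiency  = 0.70f;  // MPPT, charging, and wiring losses
const float cell_v      = 3.7f;   // nominal Li-ion voltage
const float mah_per_day = panel_w * sun_hours * efficiency / cell_v * 1000.0f;  // ≈ 757 mAh/day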
Upgrades for the next iteration
- Migrate to LoRaWAN for longer range and lower power.
- Add sleep scheduling and audio-trigger wake.
- Weather-proof enclosure (at least equivalent to IP65).
- Use multi-microphone array to increase resolution and detection scope.
- Add GPS time sync or RTC for timestamps.
- Expand classes and retrain on field data.
- Field test with more nodes over longer distances.
Imagine a ranger responsible for hundreds of square kilometers. A small device/node on a tree listens locally, runs a 2 s Edge Impulse pipeline, confirms 2-of-3, captures an 800×600 still, logs to microSD, and publishes a danger alert over MQTT. The alert is timestamped and verifiable, so action does not wait for someone to sift hours of audio.
Decisions happen on-device, bandwidth is used only when it matters, and the hardware can be sourced and serviced easily. A small solar panel and an 18650 cell keep the node self-sufficient, while the SW-420 tamper input flags disturbance events alongside acoustic detections from the built-in microphone on the XIAO ESP32S3 Sense.
Everything needed to reproduce the result is attached: KiCad, Gerbers, STL, firmware, and the Edge Impulse export. The system is small, measurable, and ready to deploy, with a clear scaling pipeline to longer-range nodes when transitioning beyond Wi-Fi.
WILDLABS.NET conservation technology network engagement
Link to post: https://wildlabs.net/discussion/solar-edge-ai-forest-watcher-prototype