We live in a beautiful state called Kerala, rightfully known as God's Own Country, where the Western Ghats capture monsoon clouds, creating extraordinary biodiversity. From mist-shrouded tea gardens to pristine rainforests where lion-tailed macaques swing through ancient canopies, our land overflows with natural abundance. Fragrant spice gardens perfume the air while intricate backwaters create a liquid paradise, and our coastline glows with bioluminescent waves. Blessed with double monsoons and hosting over 500 endemic species found nowhere else on Earth, Kerala stands as nature's masterpiece – a narrow strip between mountains and sea where every sunrise reveals not a distant paradise, but a present reality written in emerald forests, sapphire waters, and golden beaches.
Yet this paradise faces a growing shadow: frequent animal-human conflicts that claim lives on both sides, as expanding settlements and fragmented forests force these animals to venture into villages searching for food and water, their ancient migration routes now blocked by homes and farms, reminding us that even in God's Own Country, the balance between human progress and nature's needs remains delicately poised.
The mist-covered hills of Wayanad Wildlife Sanctuary in Kerala's Western Ghats tell a story of coexistence under threat. Home to over 900 wild elephants, this UNESCO World Heritage site sits at the intersection of ancient migration routes and expanding human settlements. What should be a model of conservation success has become a battleground where survival instincts clash with agricultural livelihoods.
In 2023 alone, human-elephant conflicts in Kerala resulted in 40 human deaths and the loss of over 80 elephants. The statistics paint a grim picture across the globe:
- Sri Lanka: 70 humans and 300 elephants perish annually in conflicts
- India: Economic losses exceed ₹500 crores yearly due to crop damage
- Kenya: Elephant raids destroy 75% of crops in conflict zones, forcing farmers into poverty
- Thailand: Habitat fragmentation has compressed elephant territories by 80%, intensifying human encounters
Traditional mitigation methods—from electric fences to community watch programs—have proven inadequate for the vast, densely forested terrains where elephants naturally roam. The challenge isn't just tracking these magnificent animals; it's doing so across remote landscapes with no cellular coverage, harsh weather conditions, and power constraints that would defeat conventional IoT solutions.
TrunkLink emerges from this critical need: a robust, field-tested elephant monitoring system that combines cutting-edge Nordic Semiconductor technology with long-range LoRa communication, intelligent geofencing, and solar sustainability to create an early warning system that protects both elephants and human communities.
The TrunkLink Ecosystem
TrunkLink represents a paradigm shift from simple GPS collars to intelligent, networked wildlife monitoring. At its core lies the Nordic Thingy:91 X, a cellular IoT powerhouse that we've enhanced with LoRa communication capabilities to create a hybrid connectivity solution perfect for the Western Ghats' challenging terrain.
Features Of TrunkLink
- Location Tracking: Forest rangers can monitor elephant movements in real-time, even in dense forest areas, using GPS/GNSS technology from the Nordic Thingy 91X combined with LoRaWAN connectivity.
- Community Alert System: Our platform includes a public registration portal where community members can sign up to receive automatic notifications when elephants come within a 5-kilometer radius of their location, helping prevent human-wildlife conflicts.
- Behavioral Analysis: An integrated Edge AI model analyzes elephant movement patterns and identifies concerning behaviors such as rapid movement or running, which may indicate stress or danger to the animals, allowing rangers to respond quickly.
- Virtual Boundary Monitoring: Rangers can establish custom geofences through our web portal and receive instant alerts when elephants cross these virtual boundaries, enabling proactive wildlife management and protection measures.
Let's have a sneak peek at the system architecture of the TrunkLink ecosystem.
Hardware Components: The system centers on a solar-powered collar containing GPS modules, motion sensors, and LoRa transceivers housed in weatherproof casings. The Thingy:91 X device runs an Edge Impulse machine learning model that processes motion sensor data locally to classify elephant behaviors in real-time.
Communication Flow: Location and sensor data are transmitted via the LoRa protocol to strategically positioned gateways across the monitoring area. These gateways relay data packets to a cloud infrastructure built on Firebase database architecture.
Data Processing Pipeline: The Edge Impulse model continuously analyzes accelerometer and gyroscope data to classify elephant movement patterns. When rapid movement or running behaviors are detected, the system triggers immediate emergency alerts. Simultaneously, GPS coordinates are processed against a 5-kilometer radius detection zone around subscribed user locations.
Alert System: Instead of SMS notifications, the system maintains a subscriber database where users register their coordinates. When an elephant comes within 5 km of any subscriber location, automated alerts are generated and transmitted through the cloud platform to registered users' devices.
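As a concrete illustration of this proximity check, here is a minimal Python sketch of the great-circle (haversine) distance test; the function name and coordinates are illustrative, not taken from the TrunkLink codebase:
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two WGS84 points, in kilometres."""
    r = 6371.0  # mean Earth radius, km
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(math.radians(lat1)) * math.cos(math.radians(lat2)) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Example: trigger an alert if an elephant is within 5 km of a subscriber
subscriber = (11.6854, 76.1320)  # illustrative Wayanad-area coordinates
elephant = (11.7102, 76.1567)
if haversine_km(*subscriber, *elephant) <= 5.0:
    print("Proximity alert: elephant within 5 km")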
Backend Architecture: Firebase handles real-time data storage, user subscription management, and geospatial queries. The custom dashboard provides live tracking visualization, historical movement analysis, and geofence boundary management for conservation authorities.
Power and Connectivity: Solar charging maintains continuous operation in remote areas where cellular coverage is limited or nonexistent. Future integration with Non-Terrestrial Networks (NTN) via the nRF9151 SiP will enable direct satellite uplink, eliminating dependency on LoRa gateway infrastructure.
The Thingy:91 X isn't just another development board—it's a complete IoT solution engineered for exactly the challenges we face in wildlife conservation:
Multi-Mode GNSS Precision: The integrated GNSS receiver supports GPS, GLONASS, Galileo, and BeiDou constellations, crucial for maintaining lock under dense forest canopies where single-constellation systems fail. Our field tests in Wayanad showed 40% better position accuracy compared to GPS-only solutions.
Nordic nRF9151 SiP Integration: The System-in-Package includes both the cellular modem and application processor, reducing our PCB footprint by 60% compared to discrete solutions—critical for collar weight minimization.
Future-Proof NTN Support: As Non-Terrestrial Network support rolls out, TrunkLink will seamlessly transition from LoRa+cellular to direct satellite connectivity, eliminating infrastructure dependencies.
Power Optimization: Advanced power management with multiple sleep modes enables our 6-month autonomous operation target, even with limited solar charging during monsoon seasons.
Setting Up Nordic Thingy:91 X
The nRF Connect SDK is the core development environment for the Thingy:91 X, bringing together drivers, libraries, and protocol stacks like LTE-M, NB-IoT, GNSS, Bluetooth LE, and more into one platform. Built on the Zephyr RTOS, it ensures cross-device support and continuous updates for future features such as satellite connectivity and power optimization. In short, it’s the key that unlocks the full IoT potential of the Thingy:91 X, turning hardware into a ready-to-deploy solution.
Here are the steps for the SDK installation.
1. Install dependencies
- SEGGER J-Link software – required for programming and debugging.
- nrfutil command-line tool – used for device management and firmware updates.
2. Set up the nRF Connect SDK (NCS)
- Install Visual Studio Code (VS Code).
- Add the nRF Connect for VS Code extension pack.
- Use the extension to install both the toolchain and the SDK version that matches your project requirements.
3. Verify installation
- Connect your Thingy:91 X via USB.
- Run the command below to confirm that the device is detected.
nrfutil device list
- Open the nRF Connect extension in VS Code and ensure that the correct SDK and toolchain are selected.
Here are the steps for flashing the code.
1. Create Workspace
- Make a folder near the root (e.g., C:\myfw\ncsfund); avoid long paths and spaces.
2. Create New App in VS Code
- Open the nRF Connect extension → Create a new application → Copy a sample.
- Search for blinky and choose zephyr/samples/basic/blinky.
- Store it in C:\myfw\ncsfund\l1_e2.
3. Build Configuration
- Click Add Build Configuration under your app.
- Select the correct board target (Thingy:91 X → thingy91x/nrf9151/ns).
- Build → binaries appear in /build.
4. Flash Application
- Enter the following command to program the application binary to the nRF9151 application core:
nrfutil device program --firmware dfu_application.zip --serial-number <Serial number>
For a detailed walk-through with screenshots and troubleshooting tips, check Nordic’s official guide.
LoRa Connectivity Integration
We are deploying TrunkLink in the deep forests of Wayanad — an area with limited cellular connectivity. LTE-M and NB-IoT networks are also not yet widespread in India, so relying on cellular IoT is not an option for our current deployment. To ensure reliable communication in this remote environment, we’ve integrated the LoRa-E5 module (based on the STM32WLE5JC chipset) as our primary long-range, low-power communication link. Looking ahead, the Thingy:91 X remains future-ready. Beyond cellular IoT, it’s designed to support a satellite modem, which will unlock truly global coverage, even in the most remote locations like Wayanad. We’re hopeful that satellite connectivity becomes accessible in India in the near future, bringing seamless, borderless IoT to even the deepest forests.
To connect the external sensors, the Thingy:91 X provides two options: through the I2C pins or via the debug board. We chose the debug board as the connection interface. The LoRa-E5 module was connected by wiring its VCC and GND to 3.3 V and ground, and its RX and TX pins to pins 24 and 25 of the debug probe to enable AT command-based communication.
After establishing the connections, we verified the setup by uploading code to test and confirm that the LoRa module was responding correctly.
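Before committing firmware, the module's AT interface can also be sanity-checked from a host machine. The snippet below is an illustrative sketch assuming the LoRa-E5 is temporarily attached via a USB-UART adapter; the port name is a placeholder, and it requires pyserial (pip install pyserial):
import serial
import time

# Ping the LoRa-E5 over its AT interface; the module's default baud rate is 9600
with serial.Serial("/dev/ttyUSB0", 9600, timeout=2) as port:
    for cmd in ("AT", "AT+VER", "AT+ID"):  # ping, firmware version, DevEUI/AppEUI
        port.write((cmd + "\r\n").encode())
        time.sleep(0.5)
        print(cmd, "->", port.read_all().decode(errors="replace").strip())
A healthy module answers the bare AT command with "+AT: OK".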
For testing purposes, we utilized the built-in coil antenna of the module. However, for the actual implementation, we will switch to a 2 dBi gain rubber duck antenna, operating in the frequency ranges of 824–960 MHz and 1710–1980 MHz, to achieve wider coverage. To connect the external antenna, we need to remove the zero-ohm resistor on the board. Additionally, we will use a U.FL-to-SMA connector to interface the antenna with the Seeed LoRa-E5 module, as the module features a U.FL antenna port.
To complete the setup, we used a SenseCAP M2 gateway.
This gateway serves as the backbone of the LoRaWAN network, receiving uplink data from the LoRa-E5 and forwarding it to the internet through Ethernet or Wi-Fi. The M2 supports wide-area LoRa coverage and is designed for continuous operation, making it suitable for remote monitoring in places like the Western Ghats. With its reliable concentrator module and strong signal penetration, the SenseCAP M2 ensures stable communication between field devices and the cloud.
Below is a simple guide to setting up your SenseCAP M2 gateway with The Things Network (TTN) console. First off, a huge thanks to the MakerGram community for providing the hardware and supporting the initial setup—your contributions to open-source IoT projects like this are invaluable and make initiatives like SenseCAP accessible to makers worldwide!
Since the SenseCAP M2 has already undergone initial setup (powering it on, connecting to Wi-Fi/Ethernet, and basic configuration via its web interface), we'll focus on the next steps: registering the gateway and creating an application in the TTN console. This will allow your gateway to forward LoRaWAN packets to TTN for device communication.
1. Log in to the TTN Console
- Go to https://console.cloud.thethings.network/
- Select your cluster region
2. Register Your Gateway
- Click Gateways → + Register gateway
- Enter the Gateway EUI (shown in the SenseCAP M2's web interface)
- Select a matching frequency plan and click Register gateway
3. Create a New Application
- Click Applications → + Create application
- Enter an Application ID and name
- Click Create application
4. Add Your Device
- Go to End devices → + Register end device
- Choose manual registration
- Select a matching frequency plan
- Generate the DevEUI, AppEUI, and AppKey, which we use later in the code (see the provisioning sketch below)
- Click Register end device
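With the DevEUI, AppEUI, and AppKey generated, the LoRa-E5 can be provisioned for OTAA join over the same AT interface. The following is a hedged sketch using Seeed's documented AT command set; the serial port, keys, and band are placeholders, and the band (e.g., IN865 for India) must match the frequency plan you selected in TTN:
import serial
import time

def send(port, cmd, wait=2.0):
    """Send one AT command and return the module's raw reply."""
    port.write((cmd + "\r\n").encode())
    time.sleep(wait)
    return port.read_all().decode(errors="replace").strip()

with serial.Serial("/dev/ttyUSB0", 9600, timeout=2) as p:
    send(p, 'AT+ID=DevEui,"70B3D57ED0000000"')                   # placeholder: from TTN registration
    send(p, 'AT+ID=AppEui,"0000000000000001"')                   # placeholder
    send(p, 'AT+KEY=APPKEY,"00112233445566778899AABBCCDDEEFF"')  # placeholder
    send(p, "AT+DR=IN865")              # frequency plan: must match TTN
    send(p, "AT+MODE=LWOTAA")           # over-the-air activation
    print(send(p, "AT+JOIN", wait=10))  # expect "+JOIN: Network joined"
    print(send(p, 'AT+CMSGHEX="01AB"', wait=10))  # confirmed uplink with a test payload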
Dataset
A high-quality dataset is essential for building an accurate elephant behavioral classification system, particularly for wearable devices designed for monitoring wild elephants. Elephant behavior classification systems require data that includes a variety of real-world scenarios, such as different behavioral states and movement patterns, to ensure that the model can reliably distinguish between distinct behaviors critical for conservation efforts. Gathering such a dataset for elephants is highly challenging, as it requires careful adaptation of existing animal behavior research data to elephant-specific movement patterns and conservation needs.
The TrunkLink dataset leverages the comprehensive animal behavior research from A Novel Biomechanical Approach for Animal Behaviour Recognition Using Accelerometers by Chakravarty et al. (2019). The dataset consists of 4 distinct behavioral classes with data collected from 11 recording sessions, representing natural behavioral patterns.
| Code | Behavior  | Description                                           |
|------|-----------|-------------------------------------------------------|
| 1    | Vigilance | Alert scanning; head/ear motions; surroundings check  |
| 2    | Resting   | Minimal movement; standing/lying; sleep               |
| 3    | Foraging  | Browsing/grazing; trunk manipulation; drinking        |
| 4    | Running   | High-speed locomotion; escape/aggression              |
Temporal and Spatial Characteristics:
- Sampling Frequency: 100 Hz (10ms intervals) - optimal for capturing elephant movement dynamics
- Episode Duration: 2 seconds (200 samples per episode) - sufficient for elephant behavioral pattern recognition
- Sensor Configuration: Tri-axial accelerometer (X, Y, Z axes) suitable for smart collar deployment
Preparing The Dataset
The original research dataset is stored in MATLAB's proprietary .mat format, requiring specialized extraction techniques adapted for elephant behavioral analysis. To convert it into an Edge Impulse-compatible JSON format, we used the following script:
import argparse
import csv
import gzip
import hashlib
import hmac
import json
import os
import sys
import time
from typing import Any, Dict, Iterable, List, Optional, Sequence, Tuple
import numpy as np
BEHAVIOUR_LABELS = {
1: "vigilance",
2: "resting",
3: "foraging",
4: "running",
}
# Conversion factors
GRAVITY_MS2 = 9.80665 # Standard gravity in m/s²
def load_mat_dict(path: str, squeeze: bool = True) -> Dict[str, Any]:
"""
Load a .mat file into a dict of {var_name: value}, excluding MATLAB metadata keys.
Tries scipy.io.loadmat first; on NotImplementedError (v7.3), tries mat73.
"""
try:
from scipy.io import loadmat
mdict = loadmat(path, squeeze_me=squeeze, struct_as_record=False)
return {k: v for k, v in mdict.items() if not k.startswith("__")}
except NotImplementedError:
try:
import mat73 # type: ignore
except ImportError:
raise RuntimeError(
"This .mat file appears to be v7.3 (HDF5). Install mat73 to proceed:\n"
" pip install mat73"
)
return mat73.loadmat(path)
def unwrap_singleton(obj: Any) -> Any:
"""
Repeatedly unwraps singleton containers:
- object ndarray with size 1 -> .item()
- list/tuple of length 1 -> [0]
Stops when not a singleton container.
Does not unwrap numeric ndarrays.
"""
while True:
if isinstance(obj, np.ndarray) and obj.dtype == object and obj.size == 1:
obj = obj.item()
continue
if isinstance(obj, (list, tuple)) and len(obj) == 1:
obj = obj[0]
continue
return obj
def as_object_list(obj: Any) -> List[Any]:
"""
Convert MATLAB cell-like containers to Python lists.
- If obj is an object ndarray: tolist() to preserve dimensional nesting.
- If obj is list/tuple: list() it.
- Otherwise treat as a single-element list [obj].
"""
obj = unwrap_singleton(obj)
if isinstance(obj, np.ndarray) and obj.dtype == object:
return obj.tolist()
if isinstance(obj, (list, tuple)):
return list(obj)
return [obj]
def get_behaviours_container(session_entry: Any) -> List[Any]:
"""
Return the list of 4 behaviours for a given session entry,
unwrapping 1x4x1 wrappers as needed.
"""
s = session_entry
# Remove outer singleton wrappers (e.g., 1×4×1 -> 4)
s = unwrap_singleton(s)
# Convert object arrays to lists
if isinstance(s, np.ndarray) and s.dtype == object:
s = s.tolist()
# Unwrap again in case we got [[b1,b2,b3,b4]]
s = unwrap_singleton(s)
# Now s should be a list/tuple with len == 4
if isinstance(s, (list, tuple)) and len(s) == 4:
return list(s)
# Fallback: try flatten once
if isinstance(s, (list, tuple)):
flat: List[Any] = []
for x in s:
x = unwrap_singleton(x)
if isinstance(x, (list, tuple)):
flat.extend(list(x))
else:
flat.append(x)
if len(flat) == 4:
return flat
raise ValueError(
f"Could not extract the 4 behaviours from a session entry. Got type={type(s).__name__}, "
f"length={len(s) if hasattr(s,'__len__') else 'n/a'}, shape={getattr(s,'shape', None)}"
)
def get_bouts_list(behaviour_entry: Any) -> List[Any]:
"""
Return the list of bouts for a behaviour entry, unwrapping singleton
wrappers and object arrays. If a single 200x3 array is found, return [array].
"""
b = behaviour_entry
b = unwrap_singleton(b)
# If object ndarray, convert to nested list (bouts often stored as column cell arrays)
if isinstance(b, np.ndarray) and b.dtype == object:
b = b.tolist()
# If we ended up with a numeric array (single bout), wrap it
if not isinstance(b, (list, tuple)):
return [b]
# Unwrap each element in case of [[bout],[bout],...]
bouts: List[Any] = []
for elt in b:
elt = unwrap_singleton(elt)
bouts.append(elt)
return bouts
def to_2d_numeric(arr_like: Any) -> np.ndarray:
"""
Convert a bout (expected 200x3) to a 2D numeric ndarray.
If shape is 3x200, transpose to 200x3.
"""
a = np.asarray(arr_like)
if a.ndim != 2:
raise ValueError(f"Expected 2D array for a bout, got shape {a.shape}")
if a.shape[1] == 3:
return a
if a.shape[0] == 3 and a.shape[1] != 3:
return a.T
# If neither dimension is 3, it's unexpected for this dataset.
raise ValueError(f"Unexpected bout shape {a.shape}; expected 200x3 (or 3x200)")
def iter_bouts(
var_value: Any,
sessions_filter: Optional[Sequence[int]] = None,
behaviours_filter: Optional[Sequence[int]] = None,
) -> Iterable[Tuple[int, int, str, int, np.ndarray]]:
"""
Iterate over all bouts in the dataset.
Yields tuples:
(rs, b, behaviour_label, bout_index, bout_array_200x3)
rs and b are 1-based indices to match the paper's notation.
"""
# Convert top-level to a list of sessions
sessions = as_object_list(var_value)
# If top-level came from object ndarray with extra dims, sessions could be nested.
# Ensure we have exactly 11 session entries (the first dimension).
if len(sessions) == 11 and not isinstance(sessions[0], (list, tuple, np.ndarray)):
# Already a clean list of session entries
pass
# Iterate sessions (1..11)
for rs_idx, session_entry in enumerate(sessions, start=1):
if sessions_filter and rs_idx not in sessions_filter:
continue
behaviours = get_behaviours_container(session_entry)
for b_idx in range(1, 5):
if behaviours_filter and b_idx not in behaviours_filter:
continue
beh_entry = behaviours[b_idx - 1]
bouts = get_bouts_list(beh_entry)
label = BEHAVIOUR_LABELS.get(b_idx, f"b{b_idx}")
for i, bout in enumerate(bouts):
try:
a = to_2d_numeric(bout)
except Exception as e:
raise RuntimeError(
f"Failed to parse bout rs={rs_idx}, b={b_idx}, i={i}: {e}"
) from e
yield (rs_idx, b_idx, label, i + 1, a) # 1-based bout index for readability
def open_text(path: str):
"""
Open a text file, using gzip if filename ends with .gz
"""
if path.lower().endswith(".gz"):
return gzip.open(path, "wt", newline="", encoding="utf-8")
return open(path, "w", newline="", encoding="utf-8")
def write_long(
out_path: str,
var_value: Any,
sessions_filter: Optional[Sequence[int]],
behaviours_filter: Optional[Sequence[int]],
sep: str,
precision: int,
) -> Tuple[int, int, int]:
"""
Write one row per sample:
session,behaviour_id,behaviour_label,bout_index,sample_index,ax,ay,az
Returns (n_sessions, n_bouts, n_rows).
"""
fmt = f"{{:.{precision}f}}"
n_sessions = 0
n_bouts = 0
n_rows = 0
with open_text(out_path) as f:
w = csv.writer(f, delimiter=sep)
w.writerow(["session", "behaviour_id", "behaviour_label", "bout_index", "sample_index", "ax", "ay", "az"])
last_rs = None
for rs, b, label, i, a in iter_bouts(var_value, sessions_filter, behaviours_filter):
if rs != last_rs:
n_sessions += 1
last_rs = rs
n_bouts += 1
# rows: sample_index 1..N (use 1-based index for readability)
for j in range(a.shape[0]):
n_rows += 1
w.writerow([
rs, b, label, i, j + 1,
fmt.format(float(a[j, 0])),
fmt.format(float(a[j, 1])),
fmt.format(float(a[j, 2])),
])
return n_sessions, n_bouts, n_rows
def write_wide(
out_path: str,
var_value: Any,
sessions_filter: Optional[Sequence[int]],
behaviours_filter: Optional[Sequence[int]],
sep: str,
precision: int,
) -> Tuple[int, int]:
"""
Write one row per bout:
session,behaviour_id,behaviour_label,bout_index, ax_000..ax_199, ay_000..ay_199, az_000..az_199
Returns (n_bouts, n_cols).
"""
fmt = f"{{:.{precision}f}}"
n_bouts = 0
# We don't know the sample length until we see the first bout; the paper says 200.
# We'll detect from the first encountered bout.
    # Peek at the first bout via the generator so we can reuse it below
    # without re-iterating (and duplicating) the first row.
    bout_iter = iter_bouts(var_value, sessions_filter, behaviours_filter)
    first_bout = next(bout_iter, None)
if first_bout is None:
# No data to write
with open_text(out_path) as f:
w = csv.writer(f, delimiter=sep)
w.writerow(["session", "behaviour_id", "behaviour_label", "bout_index"])
return 0, 4
n_samples = first_bout[4].shape[0]
# Build header: session metadata + 3*n_samples columns
header = ["session", "behaviour_id", "behaviour_label", "bout_index"]
header += [f"ax_{k:03d}" for k in range(n_samples)]
header += [f"ay_{k:03d}" for k in range(n_samples)]
header += [f"az_{k:03d}" for k in range(n_samples)]
with open_text(out_path) as f:
w = csv.writer(f, delimiter=sep)
w.writerow(header)
# Write the first bout row
rs, b, label, i, a = first_bout
row = [rs, b, label, i]
row += [fmt.format(float(x)) for x in a[:, 0]]
row += [fmt.format(float(x)) for x in a[:, 1]]
row += [fmt.format(float(x)) for x in a[:, 2]]
w.writerow(row)
n_bouts += 1
        # Write remaining bouts (continue from bout_iter so the first bout
        # is not written twice)
        for rs, b, label, i, a in bout_iter:
row = [rs, b, label, i]
if a.shape[0] != n_samples:
raise ValueError(
f"Inconsistent bout length: expected {n_samples}, got {a.shape[0]} "
f"(rs={rs}, b={b}, i={i})"
)
row += [fmt.format(float(x)) for x in a[:, 0]]
row += [fmt.format(float(x)) for x in a[:, 1]]
row += [fmt.format(float(x)) for x in a[:, 2]]
w.writerow(row)
n_bouts += 1
return n_bouts, len(header)
def create_edge_impulse_json(
session: int,
bout_data: np.ndarray,
hmac_key: str,
interval_ms: int = 10, # 100 Hz = 10ms intervals
convert_to_ms2: bool = True,
scale_factor: float = 1.0,
) -> dict:
"""
Create Edge Impulse JSON format for a single bout.
Args:
session: Session number
bout_data: 200x3 numpy array of accelerometer data
hmac_key: HMAC key for signing
interval_ms: Sampling interval in milliseconds
convert_to_ms2: Whether to convert data to m/s²
scale_factor: Scaling factor if data needs conversion (e.g., from g to m/s²)
"""
values = []
for row in bout_data:
ax, ay, az = float(row[0]), float(row[1]), float(row[2])
if convert_to_ms2:
# Apply conversion factor (e.g., if data is in g units, multiply by 9.80665)
ax *= scale_factor
ay *= scale_factor
az *= scale_factor
values.append([ax, ay, az])
data = {
"protected": {
"ver": "v1",
"alg": "HS256",
"iat": int(time.time())
},
"signature": ''.join(['0'] * 64), # Placeholder
"payload": {
"device_name": f"session_{session:02d}",
"device_type": "accelerometer",
"interval_ms": interval_ms,
"sensors": [
{"name": "ax", "units": "m/s2"},
{"name": "ay", "units": "m/s2"},
{"name": "az", "units": "m/s2"}
],
"values": values
}
}
# Sign the message
encoded = json.dumps(data)
signature = hmac.new(hmac_key.encode('utf-8'), msg=encoded.encode('utf-8'), digestmod=hashlib.sha256).hexdigest()
data['signature'] = signature
return data
def write_timeseries(
out_dir: str,
var_value: Any,
sessions_filter: Optional[Sequence[int]],
behaviours_filter: Optional[Sequence[int]],
hmac_key: str,
interval_ms: int = 10,
convert_to_ms2: bool = True,
scale_factor: float = 1.0,
) -> Tuple[int, int]:
"""
Write individual JSON files for each bout, formatted for Edge Impulse.
Creates separate folders for each behavior type.
Returns (n_bouts, n_files).
Args:
out_dir: Base output directory
var_value: MATLAB data structure
sessions_filter: Optional session numbers to include
behaviours_filter: Optional behavior numbers to include
hmac_key: HMAC key for signing
interval_ms: Sampling interval in milliseconds
convert_to_ms2: Whether to convert data to m/s²
scale_factor: Scaling factor for conversion (use 9.80665 if data is in g)
"""
# Create base output directory
os.makedirs(out_dir, exist_ok=True)
# Create subdirectories for each behavior
behavior_dirs = {}
for behavior_id, behavior_name in BEHAVIOUR_LABELS.items():
behavior_dir = os.path.join(out_dir, behavior_name)
os.makedirs(behavior_dir, exist_ok=True)
behavior_dirs[behavior_name] = behavior_dir
n_bouts = 0
n_files = 0
for rs, b, label, i, a in iter_bouts(var_value, sessions_filter, behaviours_filter):
n_bouts += 1
# Get the behavior-specific directory
behavior_dir = behavior_dirs[label]
# Create filename: {behaviour_label}_session_{rs:02d}_bout_{i:03d}.json
filename = f"{label}_session_{rs:02d}_bout_{i:03d}.json"
filepath = os.path.join(behavior_dir, filename)
# Create Edge Impulse JSON data
json_data = create_edge_impulse_json(
rs, a, hmac_key, interval_ms, convert_to_ms2, scale_factor
)
# Write JSON file
with open(filepath, 'w') as f:
json.dump(json_data, f, indent=2)
n_files += 1
return n_bouts, n_files
def parse_ints_list(spec: Optional[str]) -> Optional[List[int]]:
if not spec:
return None
items = []
for token in spec.split(","):
token = token.strip()
if not token:
continue
try:
items.append(int(token))
except ValueError:
raise argparse.ArgumentTypeError(f"Invalid integer in list: {token!r}")
return items or None
def main():
ap = argparse.ArgumentParser(
description="Convert the study's sessionWiseAccData_fourBehaviours .mat file to CSV."
)
ap.add_argument("input", help="Path to input .mat file")
ap.add_argument("-v", "--var", default="sessionWiseAccData_fourBehaviours",
help="Name of the variable in the .mat file (default: sessionWiseAccData_fourBehaviours)")
ap.add_argument("-m", "--mode", choices=["long", "wide", "timeseries"], default="long",
help="Export mode: 'long' (rows=samples), 'wide' (rows=bouts), or 'timeseries' (JSON files for Edge Impulse). Default: long")
ap.add_argument("-o", "--out", help="Output path: CSV file for long/wide modes, directory for timeseries mode")
ap.add_argument("--sep", default=",", help="CSV delimiter (default ',')")
ap.add_argument("--precision", type=int, default=6, help="Decimal precision for floats (default 6)")
ap.add_argument("--sessions", type=parse_ints_list,
help="Comma-separated subset of session numbers (1..11), e.g., 1,2,5")
ap.add_argument("--behaviours", type=parse_ints_list,
help="Comma-separated subset of behaviour numbers (1..4), e.g., 1,3")
ap.add_argument("--hmac-key", default="default_key",
help="HMAC key for Edge Impulse JSON signing (required for timeseries mode)")
ap.add_argument("--interval-ms", type=int, default=10,
help="Sampling interval in milliseconds for timeseries mode (default: 10ms for 100Hz)")
ap.add_argument("--convert-to-ms2", action="store_true", default=True,
help="Convert accelerometer data to m/s² (default: True)")
ap.add_argument("--scale-factor", type=float, default=1.0,
help="Scaling factor for conversion to m/s² (use 9.80665 if data is in g units, default: 1.0)")
args = ap.parse_args()
if not os.path.isfile(args.input):
print(f"Error: file not found: {args.input}", file=sys.stderr)
sys.exit(1)
try:
mdict = load_mat_dict(args.input, squeeze=True)
except Exception as e:
print(f"Failed to load .mat file: {e}", file=sys.stderr)
sys.exit(2)
if args.var not in mdict:
print(f"Variable '{args.var}' not found in {args.input}. Available: {list(mdict.keys())}", file=sys.stderr)
sys.exit(3)
value = mdict[args.var]
# Build output path
in_stem = os.path.splitext(os.path.basename(args.input))[0]
if args.out:
out_path = args.out
else:
if args.mode == "timeseries":
out_path = f"{in_stem}__timeseries/"
else:
suffix = args.mode
out_path = f"{in_stem}__{suffix}.csv"
# Warn about size in long mode
if args.mode == "long" and not out_path.endswith(".gz"):
print("Note: long mode can create a very large CSV (~GB). Consider using .gz (e.g., -o file.csv.gz).", file=sys.stderr)
# Check HMAC key for timeseries mode
if args.mode == "timeseries" and args.hmac_key == "default_key":
print("Warning: Using default HMAC key. Set --hmac-key for production use.", file=sys.stderr)
# Perform export
try:
if args.mode == "long":
n_sessions, n_bouts, n_rows = write_long(
out_path, value, args.sessions, args.behaviours, args.sep, args.precision
)
print(f"Wrote: {out_path}")
print(f"Summary: sessions={n_sessions}, bouts={n_bouts}, rows={n_rows}")
elif args.mode == "wide":
n_bouts, n_cols = write_wide(
out_path, value, args.sessions, args.behaviours, args.sep, args.precision
)
print(f"Wrote: {out_path}")
print(f"Summary: bouts={n_bouts}, columns={n_cols}")
elif args.mode == "timeseries":
n_bouts, n_files = write_timeseries(
out_path, value, args.sessions, args.behaviours,
args.hmac_key, args.interval_ms, args.convert_to_ms2, args.scale_factor
)
print(f"Wrote: {out_path}")
print(f"Summary: bouts={n_bouts}, files={n_files}")
print(f"Data organized in 4 behavior folders: vigilance, resting, foraging, running")
except Exception as e:
print(f"Export failed: {e}", file=sys.stderr)
sys.exit(4)
if __name__ == "__main__":
main()
The above script converts each elephant behavioral episode to Edge Impulse's standardized JSON format:
{
"protected": {
"ver": "v1",
"alg": "HS256",
"iat": 1695908800
},
"signature": "authenticated_hmac_signature",
"payload": {
"device_name": "trunklink_session_01",
"device_type": "elephant_collar",
"interval_ms": 10,
"sensors": [
{ "name": "ax", "units": "m/s2" },
{ "name": "ay", "units": "m/s2" },
{ "name": "az", "units": "m/s2" }
],
"values": [
[ax1, ay1, az1],
[ax2, ay2, az2],
...
[ax200, ay200, az200]
]
}
}
Execute this command to convert the raw data into JSON format:
python preprocess.py input.mat --mode timeseries -o ./timeseries_data/ --hmac-key <hmac-key> --scale-factor 9.80665
Once the command completes, you'll find a new timeseries_data folder containing all the processed data files. To import this dataset into Edge Impulse, navigate to the Data Acquisition tab and upload the files from this folder.
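Alternatively, the files can be pushed from the command line with the Edge Impulse CLI uploader (assuming the CLI is installed and you are logged in to the project), one invocation per behavior folder:
edge-impulse-uploader --category split --label running timeseries_data/running/*.json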
After uploading the elephant behavioral dataset, perform a Train/Test split, which divides the dataset into training and testing sets in an 80/20 ratio, ensuring balanced representation across all behavioral classes and recording sessions.
Create Impulse
To build an ML model in Edge Impulse for elephant behavioral classification, start by creating an Impulse. This defines the entire pipeline for processing and analyzing elephant accelerometer data from smart collars.
- To create an impulse, navigate to the Impulse Design section in your Edge Impulse project and click Create Impulse to begin setting up the TrunkLink elephant behavioral classification pipeline.
- Click Add a processing block and select Raw Data from the available processing blocks. The Raw Data block processes the raw accelerometer sensor data without pre-processing, allowing the deep learning model to learn features directly from elephant movement patterns.
- Click Add a Learning Block and choose Classification as the learning block. The Classification block learns from the raw accelerometer features and applies this knowledge to classify new elephant behavioral data into one of the four behavioral classes.
- After configuring the processing and learning blocks with elephant-specific parameters, click Save Impulse to finalize the TrunkLink behavioral classification pipeline.
Feature Generation
Proceed to the Raw Data tab to begin the feature generation process specifically designed for elephant behavioral analysis. The Raw Data tab offers various options for data manipulation, including axis scaling and filtering. For the TrunkLink elephant behavioral classification project, we retain the default settings to allow the deep learning model to learn directly from raw elephant movement patterns.
The feature generation process for elephant behavioral classification utilizes advanced algorithms designed to identify key patterns and characteristics within elephant accelerometer data:
- Temporal Patterns: Identification of elephant-specific movement rhythms and gait patterns
- Amplitude Characteristics: Recognition of movement intensity variations across behaviors
- Multi-axis Correlation: Analysis of coordinated movements across X, Y, and Z accelerometer axes
- Behavioral Transitions: Detection of transition patterns between different elephant behaviors
Generate features by clicking Generate features. This process extracts meaningful patterns from the elephant accelerometer data that will be used by the learning block to accurately classify elephant behaviors for conservation applications.
Model Training
Having extracted and prepared features from the elephant behavioral data, proceed to the Classifier tab to begin training the model. The Classifier tab offers various options for model configuration. We have trained the model with default settings.
The trained model achieved an impressive 96% accuracy, demonstrating excellent performance in classifying the four elephant behaviors (vigilance, resting, foraging, and running).
Model Testing
After training and fine-tuning the model, we evaluated its performance on unseen data using the Model Testing tab's Classify All feature. This testing phase validates the model's ability to accurately classify behaviors on new data. The high classification accuracy achieved on the test set demonstrates the model's reliability and readiness for real-world deployment.
Deployment
On the Deployment page, select the "Create Library" option and choose "C++ Library", which will create a general-purpose C++ library compatible with the Nordic Thingy:91 X.
Edge Impulse lacks native support for the Nordic Thingy:91 X, and no readily available firmware was found for this device. To address this limitation, we ported the official Edge Impulse Nordic Thingy:91 library to the Nordic Thingy:91 X platform, specifically adapting the accelerometer functionality. We validated the successful port by testing our previous Fall Detection For The Elderly project on the Nordic Thingy:91 X hardware. Detailed documentation of the porting process is available here.
After validating the fall detection implementation, we replaced the existing Edge Impulse model with our current elephant behavior classification model, preserving all hardware-specific modifications and adaptations.
Now rebuild with the new model:
docker run --rm -v $(pwd):/app edge-impulse-nordic-thingy91x west build -b thingy91x/nrf9151/ns
Once the build is successful, flash the new model.
nrfutil device program --serial-number THINGY91X_2BXXXXXXXBF --firmware build/zephyr/app_signed.hex --options target=nRF91
Firebase Real-Time Database
Firebase Real-Time Database is a NoSQL database where information is stored in JSON object format. It keeps data synchronized instantly among all connected users, making sure everyone views identical information simultaneously. Its main capabilities are:
- Real-time synchronization: All connected devices receive data updates immediately as changes occur.
- Offline functionality: The Firebase SDKs store data locally on devices, enabling applications to work without internet connectivity and synchronize modifications once the connection is restored.
- Scalability: Firebase RTDB supports high-volume applications serving millions of users effectively.
Setting Up a Firebase Project
1. Create a Firebase Project:
- Go to the Firebase Console.
- Click "Add Project".
- Enter a project name and follow the prompts to create the project.
2. Add Firebase to Your App:
- After creating the project, click on the </> (web) icon to add Firebase to your web app.
- Register your app by providing a nickname.
- Firebase will generate a configuration object containing your API keys and other settings. Keep this handy when initializing Firebase in your app.
Enabling Firebase Realtime Database
1. Navigate to Realtime Database:
- In the Firebase Console, go to the Build section in the left sidebar and select Realtime Database.
2. Create a Database:
- Click "Create Database".
- Choose a location for your database (preferably close to your users for better performance).
- Select "Start in test mode" to allow read/write access to all users temporarily (you can configure security rules later).
3. Database URL:
- Once the database is created, Firebase will provide a unique URL for your database in the format https://<project-id>-default-rtdb.firebaseio.com/ (the exact host depends on the database region).
- This URL is used to reference your database in your app.
4. Firebase Project ID:
The Firebase Project ID is a unique identifier for your Firebase project. It distinguishes your project from others and is required when making API calls or configuring Firebase services.
From the Firebase Console:
- Go to the Firebase Console.
- Select your project
- Click on the gear icon (⚙️) next to "Project Overview" in the sidebar.
- Select "Project settings".
- Under the General tab, you will find the Project ID listed.
From the Firebase Configuration Object:
When you add Firebase to your app, Firebase provides a configuration object. This object contains the projectId field:
const firebaseConfig = {
apiKey: "YOUR_API_KEY",
authDomain: "YOUR_AUTH_DOMAIN",
projectId: "YOUR_PROJECT_ID", // This is your Firebase Project ID
storageBucket: "YOUR_STORAGE_BUCKET",
messagingSenderId: "YOUR_MESSAGING_SENDER_ID",
appId: "YOUR_APP_ID"
};
5. Firebase Auth Token (Firebase Database Secret)
The Firebase Database Secret is a legacy authentication mechanism for the Firebase Realtime Database. It is a long, randomly generated string that grants full read and write access to your entire database. It was primarily used for server-side applications or tools that needed unrestricted access to the database.
Go to the Firebase Console:
- Log in to the Firebase Console.
- Select your project.
Navigate to Project Settings:
- Click on the gear icon (⚙️) next to "Project Overview" in the sidebar.
- Select "Project settings".
Access the Database Secret:
- Go to the Service Accounts tab.
- Scroll down to the Database Secrets section.
- Click "Show" to reveal the secret. You can also click "Add secret" to generate a new one if needed.
- Copy the secret and store it securely. Treat it like a password, as it grants full access to your database.
Database Structure and Data Model
Firebase Real-time Database stores data as a JSON tree. Each node in the tree can contain key-value pairs or nested child nodes.
This is the database structure that we use:
{
"elephants": {
"elephantX": {
"geofence": {
"coordinates": "latitude1,longitude1|latitude2,longitude2|latitude3,longitude3|..."
},
"livelocation": {
"lat": "latitudeX",
"lng": "longitudeX"
},
"locations": {
"uniqueLocationId1": {
"latitude": "latitudeValue1",
"longitude": "longitudeValue1",
"timestamp": "timestampValue1"
},
"...": {
"latitude": "latitudeValueN",
"longitude": "longitudeValueN",
"timestamp": "timestampValueN"
}
}
},
"elephantY": {
"geofence": {
"coordinates": "latitude1,longitude1|latitude2,longitude2|latitude3,longitude3|..."
},
"livelocation": {
"lat": "latitudeY",
"lng": "longitudeY"
},
"locations": {
"uniqueLocationId1": {
"latitude": "latitudeValue1",
"longitude": "longitudeValue1",
"timestamp": "timestampValue1"
},
"...": {
"latitude": "latitudeValueN",
"longitude": "longitudeValueN",
"timestamp": "timestampValueN"
}
}
}
}
}
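To show how a collar fix maps onto this tree, here is a hedged Python sketch that writes to it through the Firebase Realtime Database REST API, authenticated with the legacy database secret described above; the URL, secret, and coordinates are placeholders (requires the requests package):
import time

import requests

DATABASE_URL = "https://<project-id>-default-rtdb.firebaseio.com"  # placeholder
DATABASE_SECRET = "<database-secret>"                              # placeholder

def push_location(elephant_id, lat, lng):
    """Append a location fix and update the live position for one elephant."""
    auth = {"auth": DATABASE_SECRET}
    base = f"{DATABASE_URL}/elephants/{elephant_id}"
    # POST appends under a unique push key in the location history
    requests.post(f"{base}/locations.json", params=auth, json={
        "latitude": str(lat),
        "longitude": str(lng),
        "timestamp": str(int(time.time())),
    }, timeout=10)
    # PUT overwrites the livelocation node with the newest fix
    requests.put(f"{base}/livelocation.json", params=auth,
                 json={"lat": str(lat), "lng": str(lng)}, timeout=10)

push_location("elephantX", 11.6854, 76.1320)  # illustrative coordinates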
Web Interface
The TrunkLink web portal offers a dual-tier interface, serving both public community members and authorized forest rangers with distinct access levels and functionalities. The platform separates public subscription services from wildlife management operations, ensuring appropriate data access while maintaining security protocols for sensitive tracking information.
Public Subscription Portal
The public-facing portal enables community members to register accounts and subscribe to elephant proximity alerts by sharing their current location through their device's GPS. The client-side application continuously fetches real-time elephant location data from Firebase and performs distance calculations locally on the user's device, comparing their current coordinates against all tracked elephant positions. When any elephant comes within a 5-kilometer radius of the user's current location, the system triggers immediate proximity alerts.
Authorized Ranger Dashboard
The ranger dashboard offers comprehensive wildlife monitoring capabilities, including real-time elephant tracking through interactive GIS mapping, dynamic geofencing tools for creating virtual boundaries around protected areas, and immediate alerts for geofence breaches. Rangers receive prioritized notifications from the Edge AI behavioral analysis system when abnormal motion patterns indicate potential distress or aggressive behavior, along with emergency response coordination tools and comprehensive analytics for monitoring population dynamics, movement patterns, and assessing human-elephant conflict.
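The geofence stored per elephant is a "lat,lng|lat,lng|..." vertex string (see the database structure above), so a breach check reduces to a point-in-polygon test. Below is a hedged ray-casting sketch; the helper names and the rectangular fence are illustrative only:
def parse_geofence(coords):
    """Parse 'lat1,lng1|lat2,lng2|...' into a list of (lat, lng) vertices."""
    return [tuple(map(float, pair.split(","))) for pair in coords.split("|")]

def inside_geofence(lat, lng, polygon):
    """Ray casting: return True if (lat, lng) lies inside the polygon."""
    inside = False
    j = len(polygon) - 1
    for i in range(len(polygon)):
        lat_i, lng_i = polygon[i]
        lat_j, lng_j = polygon[j]
        crosses = (lat_i > lat) != (lat_j > lat)  # edge spans the point's latitude
        if crosses and lng < (lng_j - lng_i) * (lat - lat_i) / (lat_j - lat_i) + lng_i:
            inside = not inside
        j = i
    return inside

fence = parse_geofence("11.70,76.10|11.70,76.20|11.60,76.20|11.60,76.10")
if not inside_geofence(11.6854, 76.1320, fence):
    print("Geofence breach: elephant has left the protected boundary")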
The system is powered through a 5V Solar Power Manager paired with six solar panels connected in parallel, each delivering approximately 70 mA of current at 6 V; together the panels supply roughly 6 × 70 mA ≈ 420 mA at 6 V (about 2.5 W in full sun). The module is equipped with an MPPT (Maximum Power Point Tracking) feature to optimize solar energy harvesting, and it can deliver up to 900 mA of charging current to a 3.7 V Li-ion battery, whether the input comes from a USB charger or the solar panels.
Instead of modifying the Thingy:91 X for direct charging, we used an external 1800 mAh battery to keep things simple.
The Solar Power Manager has a USB output, so we can power the Thingy:91 X with a regular USB cable - making the whole setup clean and easy to work with.
This is a simple block diagram that demonstrates the power distribution in the project.
Device Design
The device features an octagonal form factor engineered to maximize surface area for six solar panels positioned radially around a central point. The housing architecture comprises three main sections: a two-part upper assembly and a bottom compartment that houses the core electronics, including the Thingy:91 X development board, Solar Power Manager module, and battery pack.
The modular upper sections secure to the base compartment via M3 × 15 mm fasteners. The entire structure was fabricated using 3D printing with PLA filament. The bottom layer of the upper assembly received a grey spray finish to enhance the overall aesthetics.
The military green color palette serves dual purposes: providing effective environmental camouflage for discrete outdoor deployment while delivering a refined, industrial aesthetic.
Assembly
The assembly procedure begins with securing the core electronics within the bottom compartment. The Thingy:91 X development board and Solar Power Manager are mounted using M3 screws and 3M double-sided tape. The debug probe has already been connected to the Thingy:91 X.
After securing the primary boards, the LoRa E5 module and battery pack are positioned within the compartment. The battery pack is placed in close proximity to the Thingy:91 X to minimize power transmission losses, while the LoRa module is positioned near the rubber duck antenna.
The rubber duck antenna is then connected to the LoRa module and secured within the designated hexagonal mounting slot to ensure stable positioning and optimal signal transmission.
The six solar panels are then fixed to the grey-painted mounting surfaces using double-sided tape. The panels are wired in a parallel configuration to optimize current output, with all connections routed to the solar charge controller for integrated power management.
Finally, all housing components are assembled and secured using M3 x 15 mm screws to create a robust, weatherproof enclosure. A durable fabric belt is attached to the integrated mounting extensions, providing a reliable attachment system for deployment on elephant collars in field applications.
We have successfully deployed TrunkLink on an elephant and verified its functionality through our website. Moving forward, we are looking to collaborate with the Wayanad Wildlife Sanctuary to implement similar tracking collars, further advancing our conservation efforts.
- Satellite Connectivity: Leverage the nRF9151's Non-Terrestrial Network (NTN) capability for direct satellite uplink, eliminating LoRa gateway dependencies and enabling global coverage in the most remote forests.
- Enhanced AI & Multi-Species Support: Predictive analytics for elephant movement forecasting, health monitoring through behavioral analysis, and expansion to other endangered species like tigers, leopards, and sloth bears.
- Mobile Apps & Community Features: Native iOS/Android apps for rangers with offline capability, community reporting portals, and smart non-harmful acoustic deterrent systems.
- Hardware Improvements: Biodegradable collar materials, extended battery life through kinetic energy harvesting, and integrated environmental sensors for water quality and vegetation health monitoring.