Across the globe, many zoos house wild animals in captivity, including wolves, which require constant monitoring to ensure their physical health and emotional well-being. Key indicators of a wolf's mood and social status within its pack are the positions of its tail and head. These subtle behavioral cues offer important insight into the animal's mental state and the pack's hierarchical dynamics.
Animals in captivity, however, are removed from their natural environment and the conditions that shaped their instincts. To compensate, it's critical to provide them with sufficient stimulation and environmental enrichment. Doing so helps preserve their innate behaviors and overall mental health. But enrichment alone is not enough; what's equally important is a reliable, continuous monitoring system.
Our idea addresses this need with an AIoT-based solution: a real-time tail-position tracking system for wolves. By analyzing the angle and position of a wolf's tail, we can assess behavioral patterns and health indicators remotely. The data is collected and visualized through a user-friendly app that allows zoo staff, veterinarians, and even remote wildlife experts to monitor the animals 24/7 from anywhere in the world.
For example, a wolf behavior specialist in Germany could easily support zookeepers in Slovenia by accessing the live data and offering expert advice based on tail movement and other cues, all through the app.
The application is designed with multiple users in mind:
Zoo caretakers, for daily monitoring and animal welfare.
Veterinarians, for health assessments and early detection of behavioral abnormalities.
Visitors, who can use the app as an educational tool, learning fun and insightful facts about wolves and their behavior.
Our vision is to blend technology with wildlife care—supporting the well-being of animals in captivity while enabling global collaboration and awareness.
Machine learning

Since we didn't have access to a comprehensive dataset of wolf tail positions and corresponding moods, we opted to create our own training data by manually recreating tail movement samples. These motions were based on well-established literature that correlates the positions of a wolf's tail, ears, and head with specific emotional states and hierarchical roles within the pack.
The image below demonstrates the most widely recognized postures, including variations in tail angle, head position, and ear orientation—each representing different behavioral cues such as dominance, submission, aggression, or fear.
We created a new project on the Edge Impulse platform and uploaded these motion samples as data points for training. Each sample was labeled with the corresponding behavioral class based on the documented body language of wolves. Using Edge Impulse's MFE (Mel-filterbank energy) processing block, we extracted relevant features from the tail movement data, which was simulated through sensor input.
With the dataset prepared, we trained a machine learning model to classify different tail positions and associate them with specific behavioral patterns. After training, we evaluated the model’s performance using test samples to assess its accuracy and ability to reliably distinguish between the behavioral categories.
This proof-of-concept serves as a foundational step toward a more advanced behavioral monitoring system.
After training and evaluating our model, we deployed it to the Arduino Nano 33 BLE. Edge Impulse provides deployment support for a wide range of microcontrollers and edge devices, which allowed us to run the trained model directly on the board.
Connectivity

To transmit data from the sensor mounted on the wolf's tail to our application, we chose to use LoRaWAN connectivity. LoRaWAN is ideal for this kind of use case due to its long-range, low-power communication capabilities—especially in environments like zoos where Wi-Fi or Bluetooth coverage may be limited. The device on the tail can operate on battery power and still reliably send tail position data to a central gateway, which then forwards it to our cloud-based system and app for real-time monitoring and analysis.
The Arduino sketch used in our project is based on a machine learning example generated through the Edge Impulse platform. It has been customized to include support for LoRaWAN communication, enabling long-range, low-power data transmission from the device mounted on the wolf’s tail.
The sketch starts by including the necessary libraries, such as the LoRaWAN library for communication and the "happydog_inferencing.h" header file for running the Edge Impulse classifier directly on the device.
Key LoRaWAN parameters—such as device address, network session key, and application session key—are defined and configured to ensure secure data transfer to the LoRa gateway and onward to the cloud infrastructure.
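The exact key-definition code depends on the LoRaWAN library in use; the abridged sketch below follows the style of the widely used arduino-lmic library's ABP example. All keys are zeroed placeholders (real values come from the LoRaWAN network console), and the pin mapping and event handling are omitted.

```cpp
// Abridged ABP (activation by personalization) setup in the style of the
// arduino-lmic library. Keys and device address are placeholders.
#include <lmic.h>
#include <hal/hal.h>

static const PROGMEM u1_t NWKSKEY[16] = { 0 };  // network session key (placeholder)
static const u1_t PROGMEM APPSKEY[16] = { 0 };  // application session key (placeholder)
static const u4_t DEVADDR = 0x00000000;         // device address (placeholder)

void setup() {
    os_init();
    LMIC_reset();
    // Copy the keys out of flash and start the session directly (ABP: no join).
    uint8_t nwkskey[16], appskey[16];
    memcpy_P(nwkskey, NWKSKEY, sizeof(nwkskey));
    memcpy_P(appskey, APPSKEY, sizeof(appskey));
    LMIC_setSession(0x13, DEVADDR, nwkskey, appskey);
}
```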
The sketch is structured to run inference continuously on real-time sensor data related to tail position (e.g., from an IMU or flex sensor). Each time the classifier detects a change in the predicted wolf behavior or tail posture, the result is serialized and sent over LoRaWAN to the gateway. From there, it is relayed to our server and made available in the app for zoo caretakers and experts to review.
This setup merges embedded machine learning with LoRaWAN communication, allowing real-time monitoring of wolf behavior in large outdoor enclosures or remote zoo settings where traditional wireless technologies like Wi-Fi or BLE may not be reliable.
Android App

The primary purpose of the application is to receive and display real-time data about the emotional state of wolves in captivity.
Connection via QR Code

The user begins by scanning a QR code located near the wolf enclosure. This connects the app to the specific sensor system associated with that group of wolves.
Wolf Overview

After connecting, the user is presented with a list of all wolves in the system. Each entry provides a quick overview of the individual wolf, including its name or identifier and a summary of its current behavioral state.
Individual Wolf Monitoring

By selecting a specific wolf from the list, the user gains access to a dedicated view where they can monitor that wolf's mood in real time, 24/7. This includes data such as tail position, mood classification, and potentially a history of behavior over time.
A demonstration of the working principle is shown in the video below.