The goal of the project is to develop an open-source, multi-modal platform featuring multiple wearable haptic bands designed to convey information.
Problem Identification
Dance, as a popular art form, relies heavily on visual perception, which makes it largely inaccessible to visually impaired individuals. I began to think about how they could perceive dance movements and engage fully with the experience through audio. I also asked what other modalities could convey these movements so that the experience is seamless and immersive for them.
Build2gether Challenge
The initial idea I had was to build a computer vision based haptic wearable to improve and simplify training, especially in dance and choreography, but also in yoga, exercise, sports, and various art forms.
The solution consists of a lanyard badge [Primary] device worn by the user, equipped with a camera for computer vision, an on-board processing system, and peripherals such as BLE/Wi-Fi, audio in/out, display, charging module, GNSS, IMU, and ultrasonic sensors.
It also includes six [Secondary] wearable devices, similar to watch straps, with haptic actuators, BLE, and charging modules. They are slim, low power, and lightweight, and fasten with a Velcro strap. The six secondary devices are worn at six points: wrist (L/R), arm (L/R), and leg (L/R). Each enclosure has Braille markers to identify the device, and all six connect to the primary device in a star network topology.
When a dance performance is ongoing (either live or on screen), the camera on the primary device faces the performance. It detects the dance motions, simplifies them into key body joints and a line diagram using on-device kinematic pose estimation, and translates them into corresponding motion patterns with pre-recorded reference models.
Based on the key points and motion patterns, a haptic feedback sequence is generated with varying frequency and amplitude and relayed by the primary device to the six secondary devices. When a joint crosses a line, a pattern is set based on the line number and the joint's X/Y position from the origin.
When these haptic patterns are played on the secondary devices, the user moves and coordinates their arms and legs accordingly, replicating the intended dance/choreography. This can happen in real time, or the patterns can be recorded and played back whenever needed.
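To give a rough idea, here is a minimal sketch of how a single haptic step could be represented and played on a wearable. The struct, function name, and timings are illustrative assumptions, not the final firmware; pin 10 matches the ERM output used in the prototype code further down.

// Hypothetical haptic step: amplitude (strength) plus pulse period (perceived frequency).
struct HapticStep {
  uint8_t  amplitude;   // ERM PWM duty, 0-255
  uint16_t periodMs;    // pulse period; a shorter period feels like a higher frequency
};

// Play one step on the ERM motor (pin 10 in the prototype wearable) for durationMs.
void playStep(const HapticStep &step, uint16_t durationMs) {
  uint32_t start = millis();
  while (millis() - start < durationMs) {
    analogWrite(10, step.amplitude);   // motor on at the requested strength
    delay(step.periodMs / 2);
    analogWrite(10, 0);                // motor off for the second half of the period
    delay(step.periodMs / 2);
  }
}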
Other use cases of the solution are:
Outdoor Activities: Navigation and walking guidance assisted by a smartphone app and GNSS, which relay navigational cues to the wearables through haptics.
Immersive Media Experience: Haptic enhancement while watching dance, music, movies, etc. A frequency analysis of the audio is performed, and each of the six secondary devices is driven in harmony with the audio (see the sketch after this list).
SOS Alerts: SOS notification via an SOS button and IMU-based fall detection, sent with GPS coordinates to the registered phone number.
Indoor Activities: Each wearable can be tagged to objects such as a water bottle, chair, or table. Pressing a button on the central device makes the tagged wearable emit a sound, so items can be tracked indoors; directional audio could extend this to indoor navigation.
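As a rough illustration of the immersive media idea, the sketch below maps six audio band magnitudes (assumed to be computed elsewhere, e.g. by an FFT on the base station) onto the six wearables, reusing the same "LW:..,LA:.." message format and 0-4095 value range as the generic example further down. The function name and scaling are assumptions.

// Hypothetical immersive-audio mapping: six band magnitudes -> one broadcast message.
// Values are scaled to 0-4095 because the wearable code maps that range to PWM.
String audioBandsToMessage(const float bands[6], float maxMagnitude) {
  const char *labels[6] = {"LW", "LA", "LL", "RW", "RA", "RL"};
  String message;
  for (int i = 0; i < 6; i++) {
    int value = (int)constrain(bands[i] / maxMagnitude * 4095.0f, 0.0f, 4095.0f);
    if (i > 0) message += ",";
    message += String(labels[i]) + ":" + String(value);
  }
  return message;   // pass this to broadcast() as in the generic example
}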
Building the solution
In the beginning, I focused on the camera-based computer vision part for real-time pose estimation using a Raspberry Pi, the Seeed Studio Grove AI module, etc. Due to my limited experience and skill set in software development, the results did not meet expectations, and the hardware I selected would need to be upgraded to a higher configuration for faster and more accurate results.
Later, I started working on a web app that converts an audio-visual file of a dance performance into an audio-haptics file. The concept: the user uploads an AV file of a dance performance (e.g. MP4), a pose estimation model runs on it, identifies the key joints, and associates them with a reference frame. If the points change beyond defined threshold values in the frame of reference, it outputs data along X/Y. The intention is to convert that into a PWM value, from 0 to 255, for the ERM motors in the wearables.
Example: Left Wrist: 200, Left Arm: 114, Left Leg: 85, Right Wrist: 120, Right Arm: 65, Right Leg: 211
This file would then be sent from the PC to the base station through USB. At this point I was running out of time, so I stopped the web app development and started building the hardware.
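For reference, this is roughly the per-joint conversion the web app was meant to perform, assuming the pose model outputs normalized (0.0-1.0) joint coordinates per frame. The threshold, scaling cap, and function name are placeholders, not the actual web app code.

// Hypothetical per-frame conversion: a joint's X/Y displacement -> 0-255 PWM value.
#include <math.h>
#include <stdint.h>

const float MOTION_THRESHOLD = 0.02f;   // ignore jitter below this displacement
const float MOTION_CAP       = 0.30f;   // assumed "large movement" that maps to full PWM

uint8_t jointDeltaToPwm(float x, float y, float prevX, float prevY) {
  float delta = fabsf(x - prevX) + fabsf(y - prevY);   // simple X/Y displacement measure
  if (delta < MOTION_THRESHOLD) return 0;              // below threshold -> no vibration
  float scaled = (delta - MOTION_THRESHOLD) / (MOTION_CAP - MOTION_THRESHOLD) * 255.0f;
  if (scaled > 255.0f) scaled = 255.0f;
  return (uint8_t)scaled;
}
// A frame is then written out in the file as e.g. "LW:200,LA:114,LL:85,RW:120,RA:65,RL:211"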
For demonstration and prototype purposes, I made a simplified version of the base station and only two wearable devices (RW and LW).
Base station components - XIAO ESP32-S3 Sense module with expansion board, Notecarrier-A with Notecard Cellular, HC-SR04 ultrasonic sensor, switch buttons, battery, M3/M2 brass inserts, M3/M2 bolts, wires, lanyard
3D Printed Files - Base_Bottom.STL, Base_LID.STL, Button.STL
Wearable components - XIAO ESP32-C3, Eccentric Rotating Mass vibration motors [ERM], battery, switch, LEDs, Velcro straps, M2 brass inserts, M3/M2 bolts (schematics attached)
3D Printed Files - Wearable_Bottom.STL, Wearable_LID.STL
The wearable PCB is designed with an ESP32-C3, LEDs, a buzzer, and an ERM motor. There are two LEDs that can be customized in the code: one blinks together with the ERM pulses and the other can be used for battery checks. The buzzer is used for battery checks and for locating the device through sound.
A user-defined switch can be used in the code to wake the device from deep sleep, report battery status, and soft-reset the device.
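A minimal sketch of how such a switch could be handled on the wearable: a short press blinks the LED as a battery-status stand-in, and a long press puts the device into deep sleep. The button pin (GPIO 2, one of the ESP32-C3 pins that can wake from deep sleep), LED pin, and timings are assumptions, not the actual firmware; pinMode(BUTTON_PIN, INPUT_PULLUP) and pinMode(LED_PIN, OUTPUT) are assumed to be set in setup().

// Hypothetical user-switch handling: short press -> battery-status blink, long press -> deep sleep.
#include "esp_sleep.h"

const int BUTTON_PIN = 2;   // assumed switch pin; only GPIO0-GPIO5 can wake the ESP32-C3 from deep sleep
const int LED_PIN    = 3;   // LED output already used in the wearable code

void handleButton() {
  if (digitalRead(BUTTON_PIN) != LOW) return;            // not pressed (active low)
  uint32_t pressedAt = millis();
  while (digitalRead(BUTTON_PIN) == LOW) {
    if (millis() - pressedAt > 3000) {                    // long press: go to deep sleep
      esp_deep_sleep_enable_gpio_wakeup(1ULL << BUTTON_PIN, ESP_GPIO_WAKEUP_GPIO_LOW);
      esp_deep_sleep_start();                             // wakes (and resets) on the next press
    }
    delay(10);
  }
  for (int i = 0; i < 3; i++) {                           // short press: blink as a battery-status stand-in
    digitalWrite(LED_PIN, HIGH); delay(150);
    digitalWrite(LED_PIN, LOW);  delay(150);
  }
}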
Enclosures have Braille Markings for identification: LW, RW, LA, RA, LL, RL
How it Works?
Communication between the base station and the wearables uses the ESP-NOW protocol (broadcast mode).
Code for the base station and wearable devices is developed in the Arduino IDE.
In the prototype, I have tested a generic example and three use cases:
- Generic example code
- SOS alerts with notification using Blues-Notecard
- Audio tags for Indoor navigation and locating things
- Audio based immersive experience
Generic example code
An example of ESP-NOW code for broadcasting values to six different wearable devices.
Base code: reads analog values from channels A0 to A5 and sends those values to the wearables.
//Generic example code
//Base code sender
//XIAO ESP32-S3
#include <WiFi.h>
#include <esp_now.h>

// Function to format MAC address
void formatMacAddress(const uint8_t *macAddr, char *buffer, int maxLength) {
  snprintf(buffer, maxLength, "%02x:%02x:%02x:%02x:%02x:%02x", macAddr[0], macAddr[1], macAddr[2], macAddr[3], macAddr[4], macAddr[5]);
}

// Callback when data is sent
void sentCallback(const uint8_t *macAddr, esp_now_send_status_t status) {
  char macStr[18];
  formatMacAddress(macAddr, macStr, 18);
  //Serial.print("Last Packet Sent to: ");
  //Serial.println(macStr);
  //Serial.print("Last Packet Send Status: ");
  //Serial.println(status == ESP_NOW_SEND_SUCCESS ? "Delivery Success" : "Delivery Fail");
}

// Function to broadcast data
void broadcast(const String &message) {
  uint8_t broadcastAddress[] = {0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF};
  esp_now_peer_info_t peerInfo = {};
  memcpy(&peerInfo.peer_addr, broadcastAddress, 6);
  if (!esp_now_is_peer_exist(broadcastAddress)) {
    esp_now_add_peer(&peerInfo);
  }
  esp_err_t result = esp_now_send(broadcastAddress, (const uint8_t *)message.c_str(), message.length());
  if (result == ESP_OK) {
    Serial.println("Broadcast message success");
  } else {
    Serial.println("Broadcast message failed");
  }
}

void setup() {
  Serial.begin(115200);
  delay(1000);
  WiFi.mode(WIFI_STA);
  Serial.println("ESP-NOW Sender Setup");
  Serial.print("MAC Address: ");
  Serial.println(WiFi.macAddress());
  WiFi.disconnect();
  if (esp_now_init() == ESP_OK) {
    Serial.println("ESP-NOW Init Success");
    esp_now_register_send_cb(sentCallback);
  } else {
    Serial.println("ESP-NOW Init Failed");
    delay(3000);
    ESP.restart();
  }
}

void loop() {
  // Create the message string with analog values read from A0-A5 (one value per wearable)
  int analogChannel1 = analogRead(A0);
  int analogChannel2 = analogRead(A1);
  int analogChannel3 = analogRead(A2);
  int analogChannel4 = analogRead(A3);
  int analogChannel5 = analogRead(A4);
  int analogChannel6 = analogRead(A5);
  String message = "LW:" + String(analogChannel1) + ",LA:" + String(analogChannel2) + ",LL:" + String(analogChannel3) +
                   ",RW:" + String(analogChannel4) + ",RA:" + String(analogChannel5) + ",RL:" + String(analogChannel6);
  broadcast(message);
  delay(1000); // Broadcast every 1 second (adjust as needed)
}
Wearable code:
For each wearable, CHANNEL_ID is changed and the code is flashed accordingly.
//Generic example code
//Wearable Receiver Code
//XIAO ESP32-C3
#include <WiFi.h>
#include <esp_now.h>

// Specify the channel this device is responsible for
#define CHANNEL_ID "LW" // Change this to "LA", "LL", "RW", "RA", or "RL" depending on the receiver

// Function to format MAC address
void formatMacAddress(const uint8_t *macAddr, char *buffer, int maxLength) {
  snprintf(buffer, maxLength, "%02x:%02x:%02x:%02x:%02x:%02x", macAddr[0], macAddr[1], macAddr[2], macAddr[3], macAddr[4], macAddr[5]);
}

// Callback when data is received
void receiveCallback(const esp_now_recv_info *recvInfo, const uint8_t *data, int dataLen) {
  char buffer[ESP_NOW_MAX_DATA_LEN + 1];
  int msgLen = min(ESP_NOW_MAX_DATA_LEN, dataLen);
  strncpy(buffer, (const char *)data, msgLen);
  buffer[msgLen] = 0; // Ensure the string is null-terminated

  // Format the MAC address
  char macStr[18];
  formatMacAddress(recvInfo->src_addr, macStr, 18);
  Serial.printf("Received message from: %s - %s\n", macStr, buffer);

  // Parse the message for this device's channel label, e.g. "LW:"
  char searchStr[4];
  snprintf(searchStr, sizeof(searchStr), "%s:", CHANNEL_ID);
  char *found = strstr(buffer, searchStr);
  if (found) {
    int value = atoi(found + strlen(searchStr)); // Get the value after the "XX:" label
    Serial.print(CHANNEL_ID); Serial.print(":"); Serial.println(value);
    int pwmval = map(value, 0, 4095, 0, 255); // 12-bit analog reading -> 8-bit PWM
    analogWrite(10, pwmval); analogWrite(3, pwmval); // Drive the ERM motor and indicator LED together
  }
}

void setup() {
  Serial.begin(115200);
  delay(1000);
  pinMode(10, OUTPUT);
  pinMode(3, OUTPUT);
  digitalWrite(10, LOW);
  digitalWrite(3, LOW);
  WiFi.mode(WIFI_STA);
  Serial.println("ESP-NOW Receiver Setup");
  Serial.print("MAC Address: ");
  Serial.println(WiFi.macAddress());
  WiFi.disconnect();
  if (esp_now_init() == ESP_OK) {
    Serial.println("ESP-NOW Init Success");
    esp_now_register_recv_cb(receiveCallback);
  } else {
    Serial.println("ESP-NOW Init Failed");
    delay(3000);
    ESP.restart();
  }
}

void loop() {
  // Nothing to do here, just wait for data
}
SOS alert notification mode
This mode is activated with a push button connected to the XIAO ESP32-S3 module. If the button is pressed for more than 6 seconds, the ESP32-S3 MCU starts communicating with the Notecarrier, and the Notecard sends an alert to the registered mobile number over SMS through the Twilio SMS service - Guide.
For instance, if a device needs to operate in a region that is not currently covered, there is a guide for using external SIM cards.
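For reference, here is a minimal sketch of what the SOS request could look like with the Blues note-arduino library, assuming the Notecard is reached over I2C and an SMS route (e.g. via Twilio) is configured on Notehub. The product UID and Notefile name are placeholders, not the actual project settings.

// Hypothetical SOS trigger using the note-arduino library (Notecard on I2C via Notecarrier-A).
#include <Notecard.h>

Notecard notecard;

void setupNotecard() {
  notecard.begin();                                                // I2C, default address
  J *req = notecard.newRequest("hub.set");
  JAddStringToObject(req, "product", "com.example.user:sos-demo"); // placeholder product UID
  JAddStringToObject(req, "mode", "continuous");
  notecard.sendRequest(req);
}

void sendSosAlert() {
  // Queue a note; the Notehub route (e.g. Twilio) turns it into an SMS to the registered number
  J *req = notecard.newRequest("note.add");
  JAddStringToObject(req, "file", "sos.qo");                       // placeholder Notefile name
  JAddBoolToObject(req, "sync", true);                             // push to Notehub immediately
  J *body = JAddObjectToObject(req, "body");
  JAddStringToObject(body, "message", "SOS: user needs assistance");
  // GPS coordinates could be added to the body from a card.location request
  notecard.sendRequest(req);
}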
Object tracking mode
Wearables can be attached or tagged to objects such as chairs, tables, water bottles, or windows. Each wearable has a buzzer and can be programmed with a different sound.
Pressing the corresponding button on the central base device worn by the VI user activates the sound for as long as the button is held. This allows the user to quickly locate items or become spatially aware of their surroundings through audio cues.
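A minimal sketch of how this could reuse the same ESP-NOW broadcast: the base station is assumed to send a "LOCATE:<ID>" message repeatedly while its button is held, and the matching wearable beeps its buzzer. The message format, buzzer pin, and tone frequency are assumptions; CHANNEL_ID is the string define from the wearable code above.

// Hypothetical object-tracking handler on the wearable, called with each received message.
const int BUZZER_PIN = 4;   // assumed buzzer pin on the wearable PCB

void handleLocateMessage(const char *msg) {
  // Base is assumed to broadcast e.g. "LOCATE:LW" for as long as its button is held
  if (strncmp(msg, "LOCATE:", 7) == 0 && strcmp(msg + 7, CHANNEL_ID) == 0) {
    // tone() is available in recent arduino-esp32 cores; older cores may need ledcWriteTone()
    tone(BUZZER_PIN, 2000, 200);   // 2 kHz beep for 200 ms, repeated with each message
  }
}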
Conclusion and Future Development
This project introduces a proof-of-concept prototype for a simple, multi-use wearable haptic platform. However, further development is necessary to enhance its robustness and expand its features.
All the modes can be combined and switched in software with an external select switch, as sketched below.
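A minimal sketch of that mode switching on the base station, assuming the select switch (active low, placeholder pin 5) cycles through the modes on each press; the mode names are placeholders.

// Hypothetical mode selection: each press of the select switch cycles to the next mode.
enum Mode { MODE_HAPTIC_GUIDE, MODE_IMMERSIVE_AUDIO, MODE_OBJECT_TRACKING, MODE_SOS, MODE_COUNT };
Mode currentMode = MODE_HAPTIC_GUIDE;
const int SELECT_PIN = 5;   // assumed select-switch pin, active low

void pollModeSwitch() {
  static bool lastPressed = false;
  bool pressed = (digitalRead(SELECT_PIN) == LOW);
  if (pressed && !lastPressed) {                          // new press detected
    currentMode = (Mode)((currentMode + 1) % MODE_COUNT);
  }
  lastPressed = pressed;
}

void runCurrentMode() {
  switch (currentMode) {
    case MODE_HAPTIC_GUIDE:    /* broadcast pose-derived PWM values      */ break;
    case MODE_IMMERSIVE_AUDIO: /* broadcast audio-band values            */ break;
    case MODE_OBJECT_TRACKING: /* broadcast "LOCATE:<ID>" while pressed  */ break;
    case MODE_SOS:             /* watch the SOS button / Notecard alerts */ break;
    default: break;
  }
}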
I aim to continue refining this concept by integrating the computer vision based solution as proposed, both offline and in real time. If you're interested, feel free to DM me and we can collaborate on this open-source project.
Thank You!