Visually impaired individuals face significant challenges in navigating independently. Trained guide dogs are a traditional solution, but they require 2+ years of specialized training and involve high costs for food, healthcare, and maintenance. Furthermore, dogs can be distracted, fall sick, or have a limited working lifespan. This makes guide dogs inaccessible to many people worldwide.
Proposed Solution

This project introduces an autonomous robotic guide rover built around an ESP32 microcontroller. Unlike a trained dog, the rover requires no training period, works consistently without fatigue, and has lower long-term costs. Equipped with an HC-SR04 ultrasonic sensor and IR sensors, it detects obstacles in real time and chooses the safest direction to move. Additional modules, such as a servo motor, buzzer, and OLED display, enhance usability and interaction.
The rover is built from an ESP32 Dev Module, IR sensors, a servo motor, a buzzer, an L298N motor driver, and an HC-SR04 ultrasonic sensor, which together detect and avoid obstacles in its path. The rover navigates around objects by scanning both its left and right sides and choosing the best direction based on real-time distance data.
How It Works

- The ultrasonic sensor continuously measures the distance ahead.
- The IR sensors widen the detection range on the left, right, and rear.
- If the front is clear, the rover moves forward.
- If an obstacle is detected, the rover stops and checks the left and right paths.
- Based on sensor input, it turns left, turns right, or makes a U-turn.
- A rear IR sensor ensures the person is not left behind (stops if the person is too far).
- The servo motor can perform actions like pulling a string.
- The buzzer gives audible alerts when danger is detected.
- The OLED display shows friendly feedback like “Hi!” or a smiley.
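The decision flow above can be sketched as a plain C++ function. The function name, thresholds, and return values here are illustrative assumptions, not taken from the project's actual firmware:

```cpp
#include <string>

// Assumed thresholds (cm); the real sketch's values may differ.
const int FRONT_CLEAR_CM = 20;   // minimum clear distance ahead
// Rear IR: the rover stops if it no longer senses the user close behind.

// Decide the rover's next move from sensor readings.
// frontCm: ultrasonic distance ahead; leftClear/rightClear: side IR sensors;
// personNear: rear IR still detects the guided person within range.
std::string chooseAction(int frontCm, bool leftClear, bool rightClear,
                         bool personNear) {
    if (!personNear) return "stop";          // don't leave the user behind
    if (frontCm > FRONT_CLEAR_CM) return "forward";
    if (leftClear)  return "turn_left";      // prefer a clear side path
    if (rightClear) return "turn_right";
    return "u_turn";                         // both sides blocked
}
```

In the real firmware this decision would run inside `loop()` and drive the L298N; the sketch above only captures the branching logic.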
Features
- Obstacle detection and avoidance.
- Real-time decision making.
- Expandable with Bluetooth, a GPS module, or an AI camera.
- Low-cost components.
- Uses PWM motor control via L298N.
Hardware Components

ESP32 Dev Module
- The “brain” of the rover.
- Handles all logic, sensor-data processing, and motor control.
- Provides Wi-Fi/Bluetooth for future upgrades like mobile app integration.
L298N Motor Driver
- Controls the two DC motors independently.
- Receives signals from the ESP32 to move forward, reverse, left, or right.
- Provides PWM speed control.
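On the ESP32, PWM speed control typically means writing an 8-bit duty cycle to the L298N's enable pins. A minimal sketch of the percent-to-duty mapping (the function name `speedToDuty` is an assumption, not from the project code):

```cpp
// Map a speed percentage (0-100) to an 8-bit PWM duty cycle (0-255),
// as would be written to the L298N's ENA/ENB pins via the ESP32's
// LEDC peripheral (e.g. ledcWrite) in an 8-bit-resolution configuration.
int speedToDuty(int percent) {
    if (percent < 0)   percent = 0;     // clamp out-of-range input
    if (percent > 100) percent = 100;
    return percent * 255 / 100;         // integer scaling to 0-255
}
```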
HC-SR04 Ultrasonic Sensor
- Measures the distance to objects in front.
- Prevents collisions by triggering stop/avoidance maneuvers.
- Effective from 2 cm to 400 cm (ideal for obstacle avoidance).
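The HC-SR04 reports distance as an echo pulse width: sound travels at roughly 343 m/s (about 0.0343 cm/µs), and the pulse covers the round trip to the obstacle and back. A small helper showing the conversion (the name `echoToCm` is illustrative):

```cpp
// Convert an HC-SR04 echo pulse width (microseconds) to distance (cm).
// Sound covers ~0.0343 cm per microsecond; halve for the round trip.
float echoToCm(unsigned long echoUs) {
    return echoUs * 0.0343f / 2.0f;
}
```

In an Arduino sketch, `echoUs` would come from `pulseIn()` on the sensor's ECHO pin after a 10 µs trigger pulse.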
IR Sensors
- Left & Right: detect close-range side obstacles that the ultrasonic sensor might miss.
- Rear: detects whether the person being guided is close; if the person falls behind (>30 cm), the rover stops.
DC Motors
- Provide movement for the rover.
- Controlled through the L298N driver and powered by an external battery.
Servo Motor
- Can be used for pulling a string or moving attachments.
- Demonstrates expandability for assistive tasks.
Buzzer
- Gives audio feedback when danger is detected or the rover halts.
- A simple but effective alert system.
OLED Display
- Provides visual feedback like “Hi!” or a smiley face.
- Positioned at the head of the rover for a more engaging demo.
Battery Pack
- Powers both the ESP32 and the motors via the L298N module.
- Ensures portability and independence from wired power sources.
Arduino IDE Installation
- Install the latest version of Arduino IDE.
- Add the ESP32 Board Package via the Boards Manager (using the Espressif URL).
Library Dependencies
- Adafruit_GFX and Adafruit_SSD1306 → for the OLED display.
- ESP32Servo → for controlling the servo motor.
- Wire → for I2C communication with the OLED.
Code Upload
- Connect ESP32 via USB cable.
- Select Board: ESP32 Dev Module.
- Select the correct COM Port.
- Upload the provided code to the ESP32.
Testing
- Open Serial Monitor at 115200 baud to check sensor readings.
- Verify motors respond correctly to forward, left, and right commands.
- Confirm buzzer and OLED work as expected.
Applications

- Assistive navigation for visually impaired individuals.
- Educational robotics projects.
- Expo demonstration of autonomous navigation.
- Base platform for delivery or patrolling robots.
Why It Matters

- Accessibility Gap: Millions of visually impaired individuals struggle with mobility. Guide dogs are expensive (up to $50,000 over their lifetime) and not accessible to everyone. This rover offers a low-cost, scalable alternative.
- Affordability: Built from off-the-shelf electronics, the cost is a fraction of training a guide dog, making it feasible for large-scale adoption.
- Consistency: Unlike animals, the rover doesn’t tire, get sick, or lose focus. It provides reliable assistance 24/7.
- Scalability: Units can be mass-produced and distributed to underserved regions, addressing the global shortage of guide dogs.
Future Enhancements

GPS Navigation: Adding a GPS module would allow the rover to follow predefined routes or assist with navigation outdoors, helping the user reach specific destinations with greater autonomy.
AI-Based Camera Vision: Incorporating an AI-enabled camera would enable the system to recognize obstacles, traffic signs, and human gestures. This would allow smarter decision-making, obstacle avoidance, and even pedestrian signal detection for crossing roads safely.
Bluetooth Audio Output: A Bluetooth module can be used to transmit real-time audio commands to a wireless earpiece worn by the user. This allows the system to give clear verbal instructions or alerts (e.g., “Obstacle ahead,” “Turn left”) without relying on visual cues.
Mobile App Integration: A dedicated smartphone app could allow users or caregivers to remotely monitor the rover’s status, location, and battery level. It could also offer manual control and speed adjustment, and caregivers could receive alerts in case of an emergency or if the rover stops unexpectedly.
We used a Bambu Lab 3D printer to print the body of the rover. We first designed the individual parts, then assembled them to obtain the final build.