This project demonstrates a fully autonomous robot built with zero onboard sensors.
No ultrasonic sensors. No IR modules. No encoders. No IMU.
Instead of relying on hardware-heavy sensing, the robot uses computer vision powered by Python and OpenCV. An overhead camera tracks an ArUco marker mounted on the robot and calculates its real-time position and orientation.
The ESP32 only receives motor speed commands over WiFi and executes them. All intelligence runs externally.
It’s a clean example of how software can replace hardware complexity.
💡 Why Did I Decide to Make It?

Most beginner robotics projects follow the same path:

- Add more sensors
- Add more wiring
- Add more calibration
- Deal with sensor noise and drift
I wanted to challenge that approach.
The goal was simple:
Can I build an autonomous robot without adding any onboard sensing hardware?
This project proves that in controlled environments, external computer vision can completely replace traditional sensor stacks — while reducing cost and complexity.
It’s also a powerful learning platform for:
- Vision-based robotics
- Differential drive control
- WiFi-based command systems
- Path planning fundamentals
How Does It Work?
The system is divided into two main parts:
1️⃣ Vision & Control (Computer Side)

- An overhead camera captures the workspace.
- OpenCV detects the ArUco marker attached to the robot.
- The marker provides:
  - X position
  - Y position
  - Orientation angle
- A user draws a path on the screen.
- The system calculates heading error and distance error.
- Motor speed corrections are computed in real time.
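The error-to-speed step can be sketched as a simple proportional controller. The gains, PWM range, stop radius, and sign convention below are illustrative assumptions, not the project's tuned values:

```python
import math

BASE_SPEED = 150      # nominal PWM duty (0-255), assumed
KP_HEADING = 120      # proportional gain on heading error, assumed
STOP_RADIUS = 15      # pixels: waypoint counts as reached inside this

def control_step(x, y, theta, tx, ty):
    """Return (left, right) motor speeds steering toward waypoint (tx, ty)."""
    dx, dy = tx - x, ty - y
    distance_error = math.hypot(dx, dy)
    if distance_error < STOP_RADIUS:
        return 0, 0                       # waypoint reached: stop
    # Heading error, wrapped into [-pi, pi]
    heading_error = math.atan2(dy, dx) - theta
    heading_error = math.atan2(math.sin(heading_error), math.cos(heading_error))
    # Differential drive: positive (counter-clockwise) error slows the
    # left wheel and speeds the right wheel, clamped to the PWM range.
    turn = KP_HEADING * heading_error
    left = int(max(0, min(255, BASE_SPEED - turn)))
    right = int(max(0, min(255, BASE_SPEED + turn)))
    return left, right
```

Running this once per camera frame closes the loop: the vision pose goes in, a fresh speed pair comes out.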
2️⃣ Execution (Robot Side)

- ESP32 connects to WiFi.
- Receives motor speed commands via HTTP.
- Controls motors using PWM through a TB6612 driver.
- Executes differential drive motion.
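On the computer side, sending one command per control cycle can be sketched with the standard library. The ESP32's IP address, the `/motor` route, and the `left`/`right` query parameters are hypothetical; match them to whatever the firmware actually parses:

```python
from urllib.parse import urlencode
from urllib.request import urlopen

ESP32_IP = "192.168.1.50"   # assumed address on the local network

def build_command_url(left, right):
    """URL for one motor-speed command, e.g. http://.../motor?left=150&right=150"""
    return f"http://{ESP32_IP}/motor?" + urlencode({"left": left, "right": right})

def send_speeds(left, right, timeout=0.2):
    """Fire one HTTP GET with the latest speeds; a short timeout keeps the
    control loop responsive if a packet is lost."""
    with urlopen(build_command_url(left, right), timeout=timeout) as resp:
        return resp.status
```

Plain HTTP GET is a deliberate simplicity trade-off here: at camera frame rates the command stream is small, and the firmware only needs a trivial query-string parser.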
The robot itself has no awareness — it simply follows commands generated by the vision system.
⚙ Hardware Used

- ESP32 Dev Board
- TB6612FNG Motor Driver
- 2 DC Gear Motors
- Robot Chassis
- Printed ArUco Marker
- Overhead Webcam
Total cost: budget-friendly, roughly 2000 INR (about $20).
🎥 Demonstration

Watch the robot follow a drawn path autonomously.
🎯 Key Takeaways

✔ Zero onboard sensors
✔ Vision-based localization
✔ Low hardware complexity
✔ Clean software architecture
✔ Easily extendable to multi-robot systems
I AM VIKAS I LOVE ROBOTICS❤️✨🍀