Real-Time 3D Mapping for $30

You can add a 3D scanner to your projects for $30 with just an ESP32, a time-of-flight sensor, and an IMU.

Nick Bild
Generating a 3D map of a room with $30 in hardware (📷: Henrique Ferrolho)

Generating a 3D map of a scene is a prerequisite for all sorts of autonomous navigation and augmented reality applications, as well as for the development of advanced robotics systems. It is also typically not an inexpensive task, as it requires a sophisticated suite of sensors and high-performance computing to process vast amounts of spatial data in real time.

We recently reported on a Raspberry Pi-based LiDAR scanner that could be built for well under $1,000 (if you can find the right deals on eBay), which is a bargain for this type of system. However, that price tag is still pretty hefty for the casual experimenter. Fortunately, Henrique Ferrolho has now brought us a more affordable solution. It may not be the most precise 3D mapper around, but you can make your own copy of it for about $30.

At the core of the project is STMicroelectronics’ VL53L5CX time-of-flight sensor. Unlike the more familiar single-beam distance modules, this tiny chip measures depth across an 8×8 grid, producing 64 simultaneous distance readings. With a diagonal field of view of about 65 degrees and a maximum range near four meters, it effectively sprays invisible infrared measurement pulses into the room and records how long they take to bounce back. The result is a low-resolution depth image captured roughly 15 times per second.
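To make the geometry concrete, here is a minimal sketch (not Ferrolho's actual code) of how a frame of 64 zone distances can be projected into 3D points. It assumes each zone sits at a fixed angular offset within a roughly 45°-per-axis field of view, a simplifying pinhole-style model; the function name and constants are illustrative.

```python
import math

FOV_DEG = 45.0  # assumed per-axis field of view (the ~65-degree figure is diagonal)
GRID = 8        # the VL53L5CX's 8x8 multizone grid

def zones_to_points(distances_mm):
    """Convert a flat list of 64 zone distances (mm) into 3D points (meters).

    Each zone is treated as a ray at a fixed angular offset from the
    sensor's optical axis -- a simplification for illustration.
    """
    points = []
    step = math.radians(FOV_DEG) / GRID
    for row in range(GRID):
        for col in range(GRID):
            d = distances_mm[row * GRID + col] / 1000.0  # mm -> m
            # Angular offset of this zone's center from the optical axis
            ax = (col - (GRID - 1) / 2) * step  # horizontal angle
            ay = (row - (GRID - 1) / 2) * step  # vertical angle
            # Ray direction with z pointing forward, scaled by distance
            points.append((d * math.tan(ax), d * math.tan(ay), d))
    return points
```

Feeding in a frame where every zone reads 1000 mm yields 64 points fanned across the field of view, all one meter out along the optical axis.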

Ferrolho pairs the sensor with a BNO085 inertial measurement unit, which tracks the orientation of the device in space. That orientation data is crucial: the system doesn’t just measure distances, it also knows which direction it was pointing when each measurement was taken. Both sensors connect to an ESP32 development board over a shared I2C bus, and the microcontroller streams the readings as JSON over a serial link to a computer.
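On the receiving end, each serial line can be decoded with a few lines of Python. The field names below (`distances`, `quat`) are assumptions for illustration; the actual keys depend on Ferrolho's firmware.

```python
import json

def parse_frame(line):
    """Parse one JSON frame streamed by the microcontroller.

    Assumed layout: 64 zone distances in millimeters plus a unit
    orientation quaternion (w, x, y, z) from the IMU.
    """
    frame = json.loads(line)
    distances = frame["distances"]  # 64 ints, mm
    quat = tuple(frame["quat"])     # (w, x, y, z)
    if len(distances) != 64:
        raise ValueError("expected 64 zone readings")
    return distances, quat

# With pyserial, frames could then be read line by line, e.g.:
#   import serial
#   with serial.Serial("/dev/ttyUSB0", 115200) as port:
#       distances, quat = parse_frame(port.readline())
```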

On the PC side, a Python viewer reconstructs the environment in real time. Opening a local web page reveals a rotating 3D visualization where each of the 64 zones appears as a ray extending into space. As you move the device around, the virtual scene moves with it, thanks to quaternion orientation data from the IMU.
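Applying the IMU's quaternion to each ray is standard vector rotation. A self-contained sketch, using the common expansion of v' = q v q* (again illustrative, not the project's exact code):

```python
def _cross(a, b):
    """Cross product of two 3-vectors."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def rotate_by_quaternion(v, q):
    """Rotate vector v = (x, y, z) by unit quaternion q = (w, x, y, z).

    Uses the identity v' = v + w*t + qv x t, where t = 2 * (qv x v).
    This is how an IMU orientation can be applied to each sensor ray
    before rendering it in the world frame.
    """
    w, qx, qy, qz = q
    qv = (qx, qy, qz)
    t = tuple(2.0 * c for c in _cross(qv, v))
    tv = _cross(qv, t)
    return tuple(v[i] + w * t[i] + tv[i] for i in range(3))
```

For example, rotating the x-axis by a quaternion representing a 90-degree turn about z maps (1, 0, 0) to (0, 1, 0).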

Because inexpensive time-of-flight sensors can be noisy, the software applies temporal smoothing using an exponential moving average and attempts surface detection using least-squares and RANSAC plane-fitting algorithms. A mapping mode accumulates measurements over time, gradually building a recognizable model of walls, floors, and furniture.
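Both techniques are compact enough to sketch. Below, `ema_update` smooths successive frames, and `ransac_plane` fits a plane to noisy points by repeatedly sampling three of them and keeping the hypothesis with the most inliers. This is a generic illustration of the two algorithms named above, not the project's implementation; parameter values are assumptions.

```python
import math
import random

def ema_update(prev, new, alpha=0.3):
    """Exponential moving average over the 64 zone distances."""
    return [alpha * n + (1.0 - alpha) * p for p, n in zip(prev, new)]

def _sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def _cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def _dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def ransac_plane(points, iterations=200, threshold=0.02, seed=None):
    """Fit a plane (n, d) with n.p + d = 0 to noisy 3D points via RANSAC.

    Samples three points, builds the plane through them, and keeps the
    hypothesis with the most inliers within `threshold` meters.
    """
    rng = random.Random(seed)
    best, best_count = None, 0
    for _ in range(iterations):
        a, b, c = rng.sample(points, 3)
        n = _cross(_sub(b, a), _sub(c, a))
        norm = math.sqrt(_dot(n, n))
        if norm < 1e-9:
            continue  # degenerate (collinear) sample
        n = (n[0] / norm, n[1] / norm, n[2] / norm)
        d = -_dot(n, a)
        count = sum(1 for p in points if abs(_dot(n, p) + d) < threshold)
        if count > best_count:
            best, best_count = (n, d), count
    return best, best_count
```

Run on a grid of points lying on the z = 1 plane plus an outlier, the fit recovers a normal along z and ignores the stray point.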

It’s not LiDAR, and it won’t rival professional scanners, but that’s not the goal. By combining a multizone depth sensor, an IMU, and a few clever algorithms, this project demonstrates that practical 3D spatial sensing is now accessible to hobbyists and robotics developers alike. If you’d like to try it out for yourself, the viewer software is available on GitHub.

Nick Bild
R&D, creativity, and building the next big thing you never knew you wanted are my specialties.