This DIY Rig Lets You See Through Walls
See how Basically Homeless used mmWave radar and UWB to build real-life AR "X-ray vision" that tracks people through walls in real time.
Have you ever wished you could have superpowers? Maybe incredible strength like the Hulk, or X-ray vision like Superman? Unless you want to spend your entire life in the gym, super strength might mean rolling the dice with gamma radiation (not recommended outside of comic books). X-ray vision, on the other hand, may be achievable, if you don't mind using your phone to help.
In a recent video, Basically Homeless shows us how we can unlock this superpower for ourselves. Fortunately, you don’t need to be born on an alien world or receive a massive dose of radiation. All you’ll need is the right sensors and an understanding of how they work.
This futuristic system is designed around inexpensive millimeter wave (mmWave) radar sensors: tiny devices capable of detecting human presence, and even subtle movements like breathing, right through walls. By placing a network of these sensors around a home, the creator effectively built a distributed awareness system capable of tracking people in multiple rooms simultaneously.
Each sensor is connected to a small ESP32 development board, which transmits detection data wirelessly to a central processing unit. This hub — running on a Raspberry Pi or laptop — collects and combines all incoming signals into a unified coordinate system. However, knowing where others are isn’t enough; the system also needs to know where you are.
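The video doesn't walk through the fusion code, but the core step of merging signals into a unified coordinate system is conceptually simple: each node's local detections get rotated and translated by that node's known pose on the floor plan. A minimal sketch (the function name and example poses are hypothetical, not from the build):

```python
import math

def sensor_to_world(node_x, node_y, node_heading_rad, rng, bearing_rad):
    """Convert a (range, bearing) detection in a radar node's local frame
    to shared world coordinates, given the node's position and heading."""
    world_angle = node_heading_rad + bearing_rad
    return (node_x + rng * math.cos(world_angle),
            node_y + rng * math.sin(world_angle))

# Example: a node mounted at (2.0, 3.0) facing along +x reports a target
# 1.5 m away, dead ahead.
x, y = sensor_to_world(2.0, 3.0, 0.0, 1.5, 0.0)
```

Once every node's detections live in the same frame, the hub can cluster nearby reports from different sensors into single tracked targets.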
To solve this, the build incorporates ultra-wideband (UWB) anchors positioned throughout the house. Each anchor measures its distance to a tag the user wears via radio time-of-flight, and intersecting ranges from several anchors pins down the wearer's location with high precision, functioning like a personal indoor GPS. The system then continuously tracks the user's position relative to detected targets.
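UWB positioning of this kind generally comes down to trilateration: each measured distance defines a circle around its anchor, and the tag sits where the circles intersect. A minimal 2-D version with three anchors, linearized into a small linear system (a sketch under those assumptions, not the project's actual code):

```python
def trilaterate(anchors, dists):
    """Estimate a 2-D position from three anchors at known positions
    and their measured distances (e.g. from UWB time-of-flight)."""
    (x0, y0), (x1, y1), (x2, y2) = anchors
    d0, d1, d2 = dists
    # Subtracting the first circle equation from the other two cancels
    # the quadratic terms, leaving two linear equations in (x, y).
    a1, b1 = 2 * (x1 - x0), 2 * (y1 - y0)
    c1 = d0**2 - d1**2 + x1**2 - x0**2 + y1**2 - y0**2
    a2, b2 = 2 * (x2 - x0), 2 * (y2 - y0)
    c2 = d0**2 - d2**2 + x2**2 - x0**2 + y2**2 - y0**2
    det = a1 * b2 - a2 * b1  # anchors must not be collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Anchors in three corners of a room; distances to a tag at (1, 2).
pos = trilaterate([(0, 0), (4, 0), (0, 4)], [5**0.5, 13**0.5, 5**0.5])
```

Real deployments use more than three anchors and a least-squares fit, which also smooths out ranging noise from multipath reflections.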
Mapping the environment was another important piece of the system. Instead of manually measuring rooms, the creator used a robotic vacuum equipped with LiDAR to generate a highly accurate floor plan. This digital map ensures that all sensor data aligns correctly with real-world locations, forming the foundation for the augmented reality overlay.
The final piece of the puzzle is the display: an Android smartphone. Using its camera and orientation sensors, the phone overlays bounding boxes onto live video, indicating where people are located — even behind walls.
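The overlay boils down to projecting a tracked position from floor-plan coordinates into the phone's camera view. A simplified sketch of the horizontal axis only, assuming a pinhole camera with a known field of view (the function and parameter names are my own, not from the video):

```python
import math

def target_to_screen_x(cam_x, cam_y, cam_yaw_rad, tgt_x, tgt_y,
                       hfov_rad, screen_w):
    """Map a target's floor-plan position to a horizontal screen pixel,
    given the phone's position and compass heading (pinhole model)."""
    # Bearing to the target, relative to where the camera is pointing.
    bearing = math.atan2(tgt_y - cam_y, tgt_x - cam_x) - cam_yaw_rad
    bearing = (bearing + math.pi) % (2 * math.pi) - math.pi  # wrap to [-pi, pi]
    # Focal length in pixels from the horizontal field of view.
    focal_px = (screen_w / 2) / math.tan(hfov_rad / 2)
    return screen_w / 2 + focal_px * math.tan(bearing)

# A target straight ahead lands in the center of a 1080-px-wide view.
center = target_to_screen_x(0, 0, 0, 5, 0, math.pi / 3, 1080)
```

The same idea extends to the vertical axis and to box size (scaled by distance), which is how a 2-D bounding box ends up pinned over a person standing behind a wall.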
While it may not be true X-ray vision, this project demonstrates how accessible technology is rapidly blurring the line between science fiction and reality. Check out the video below to see the system in action.
R&D, creativity, and building the next big thing you never knew you wanted are my specialties.