Helping the Blind See with a LIDAR, iPad, and Teensy 3.6 Tactile Interface

This project uses an iPad's built-in LIDAR scanner and a custom tactile interface to give sight-impaired individuals physical feedback about nearby obstacles, letting them avoid them.

The Challenge

For people who are blind, navigating a room can be quite difficult. Avoiding bumping into objects is normally done with some kind of cane, but what if someone wanted to "see" further ahead? YouTuber Shane Wighton (AKA Stuff Made Here) — who you may recall from his robotic basketball hoop or auto-adjusting golf club — has done just that by creating a system that utilizes an iPad's LIDAR capabilities and a custom-built tactile interface that alerts users to obstacles nearby. As a bonus, the iPad app also converts the LIDAR map into an augmented reality overlay on the real world, giving others a look at the processed information.

How LIDAR Works

LIDAR stands for Light Detection and Ranging, and it operates in a similar manner to radar: electromagnetic waves are bounced off a surface and timed as they return to the point of origin. However, while radar uses invisible microwaves to find objects, LIDAR employs laser light — either a spinning laser or an array of laser emitter and detector pairs. By sending out a pulse of light at a target and timing how long it takes to detect a reflection, extremely precise distances can be measured. When this principle is applied to a matrix of individual points, detailed depth maps can be generated.
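As a rough illustration of the time-of-flight principle described above (not code from the project), the distance calculation boils down to halving the round-trip travel time of the light pulse:

```cpp
#include <cstdio>

// Speed of light in meters per second.
constexpr double SPEED_OF_LIGHT = 299792458.0;

// The pulse travels to the target and back, so the one-way distance
// is half of the total round-trip path length.
double distanceFromRoundTrip(double roundTripSeconds) {
    return SPEED_OF_LIGHT * roundTripSeconds / 2.0;
}

int main() {
    // Example: a reflection detected ~13.3 nanoseconds after the pulse
    // fired corresponds to a target roughly 2 meters away.
    std::printf("Distance: %.3f m\n", distanceFromRoundTrip(13.3e-9));
    return 0;
}
```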

Processing and Displaying the Data

The iPad is set up to take readings at regular intervals and use that data to build a distance map. The result looks a lot like a heat map: objects closer to the user take on a redder hue, while those farther away shift toward green. This color coding lets external observers quickly tell what the system is seeing.
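The article doesn't spell out the exact color mapping, but a minimal sketch of a close-is-red, far-is-green ramp might look like the following; the near and far bounds here are placeholder values chosen for illustration:

```cpp
#include <algorithm>
#include <cstdint>

struct RGB {
    uint8_t r, g, b;
};

// Map a depth reading (in meters) to a red-to-green color, with red
// meaning "close" and green meaning "far". The near/far bounds are
// arbitrary; the real app's mapping isn't documented in the article.
RGB depthToColor(float depthMeters, float nearM = 0.3f, float farM = 5.0f) {
    float t = (depthMeters - nearM) / (farM - nearM);
    t = std::clamp(t, 0.0f, 1.0f);              // 0 = close, 1 = far
    return RGB{
        static_cast<uint8_t>((1.0f - t) * 255), // more red when close
        static_cast<uint8_t>(t * 255),          // more green when far
        0};
}
```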

Physical Feedback

To alert the user to a potential obstruction in their path, the app sends information to a Teensy 3.6, which pushes each pin in a grid to one of three depths against the hand, corresponding to where an impediment might be. The wearer wraps their hand around a metal cylinder covered in a grid of pins, with each row of the device assigned to a single finger. As seen in the following illustration, the pins press against the hand to draw a mental map: a pin that isn't pushed in indicates a clear area, a half-pushed pin means something is nearby, and a fully-pushed pin means to avoid that area.
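A small sketch of how the distance reading for one grid cell could be quantized into those three pin levels is shown below; the thresholds are invented for illustration, since the project's actual values and firmware aren't published in the article:

```cpp
// Pin states as described in the article: not pushed (clear),
// half pushed (something nearby), fully pushed (avoid this area).
enum class PinLevel { Clear, Half, Full };

// Quantize the nearest obstacle distance (in meters) for one grid cell
// into a pin level. The 1 m / 2 m thresholds are made up for this
// example and are not the project's actual values.
PinLevel levelForDistance(float nearestObstacleMeters) {
    if (nearestObstacleMeters < 1.0f) return PinLevel::Full;
    if (nearestObstacleMeters < 2.0f) return PinLevel::Half;
    return PinLevel::Clear;
}
```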

The mechanism to drive the pins is quite ingenious. Two opposing stepper motors work together to either rotate the central cylinder or move it up and down. This allows the pin map to have a small form factor.
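The video doesn't detail the exact kinematics, but a common way to get two motions out of two motors is a differential arrangement, where the sum of the motors' movements drives one axis and the difference drives the other. Purely as a hypothetical sketch — this is not necessarily how Wighton's mechanism is geared:

```cpp
// Hypothetical differential drive: the two motors' motions combine so
// that commanding them together rotates the cylinder, while commanding
// them in opposition raises or lowers it. Axis assignments and signs
// are assumptions for illustration only.
struct MotorSteps {
    long motorA;
    long motorB;
};

MotorSteps stepsFor(long rotateSteps, long liftSteps) {
    return MotorSteps{
        rotateSteps + liftSteps,   // motor A
        rotateSteps - liftSteps};  // motor B
}
```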

Augmented Reality App

As mentioned in the video, Apple's SDK for integrating its LIDAR system into an app makes development fairly simple. A few buttons on the right side give the user extra controls, including zoom and camera selection. Along with a depth-data preview window, the app overlays a mesh surface on top of objects to show exactly how they're being tracked.

Using the Device

To test it, Wighton arranged a series of barriers of varying heights and widths in a room. All in all, it worked decently well for larger objects, but the system struggled to "display" a high-resolution map on the cylinder simply because there weren't many pins. Some things Wighton wants to change include more granularity in pin pressure, a greater number of pins, and a different drive mechanism. Overall, this project gives sight-impaired people a whole new way to perceive the world around them while remaining low cost.

Evan Rust
IoT, web, and embedded systems enthusiast. Contact me for product reviews or custom project requests.