Eye Imaging Technology Tapped to Help Vehicles See Better

Duke engineers leveraged eye imaging technologies to develop a system that improves the vision capabilities of autonomous cars and robots.

Cabe Atwell
4 years ago • Sensors / Environmental Sensing
Engineers are ditching conventional LiDAR in favor of FMCW (Frequency-Modulated Continuous Wave) LiDAR, which operates on the same principle as OCT (Optical Coherence Tomography). (📷: Duke Engineering)

Engineers from Duke University have turned to eye imaging technology, typically found in an optometrist's office, to develop a system that allows autonomous vehicles and robots to "see" better. One of the most widely used imaging technologies for self-driving cars is LiDAR, which maps objects using laser light rather than the radio waves employed by RADAR. While LiDAR does its job well, it has drawbacks in 3D vision applications, including reduced performance in bright ambient sunlight, limited depth resolution, and the long time it takes to map large areas.

To overcome those issues, the team turned to a variant known as FMCW (Frequency-Modulated Continuous Wave) LiDAR, which operates on the same principle as OCT (Optical Coherence Tomography). OCT is the optical analogue of ultrasound, which works by sending sound waves into objects and measuring how long they take to return. Because light travels far too fast to time its return directly, OCT devices instead measure how much the returning light waves' phase has shifted compared to identical light waves that have traveled the same distance but have not interacted with another object.
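The phase-to-distance relationship behind this idea can be illustrated with a toy calculation. This is a simplified sketch of the principle only, not the Duke system's actual signal processing: it assumes a single optical frequency and ignores phase wrapping, which real OCT resolves with spectral interferometry.

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def range_from_phase(phase_shift_rad: float, optical_freq_hz: float) -> float:
    """One-way range implied by an interferometric phase shift.

    A wave of frequency f accumulates phase = 2*pi*f*t over a delay t,
    so t = phase / (2*pi*f). The light travels out and back, so the
    one-way range is c*t/2.
    """
    delay_s = phase_shift_rad / (2 * math.pi * optical_freq_hz)
    return C * delay_s / 2
```

A full 2π of phase shift corresponds to one extra wavelength of round-trip path, i.e. half a wavelength of one-way range, which is why such measurements can resolve distance far below the millimeter scale.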

FMCW LiDAR takes a similar approach, with a few tweaks. The system sends out a laser beam that continually shifts between frequencies. When the detector receives returning light, it can distinguish the beam's specific frequency pattern from any other light source, allowing it to work in varied lighting conditions at very high speed. It then measures any phase shift against unimpeded beams, making it more accurate at determining distance than current LiDAR systems.
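In a standard linear-sweep FMCW scheme, the delayed echo is mixed with the outgoing sweep, and the resulting "beat" frequency is proportional to range. The sketch below shows that textbook relationship; the sweep bandwidth and duration are illustrative values, not parameters from the Duke system.

```python
C = 299_792_458.0  # speed of light, m/s

def fmcw_range(beat_freq_hz: float, bandwidth_hz: float, sweep_time_s: float) -> float:
    """Range from the beat frequency of a linear FMCW sweep.

    An echo from range R is delayed by t = 2R/c. Mixing it with the
    outgoing sweep (slope B/T) produces a beat frequency
    f_b = (B/T) * t, so R = c * f_b * T / (2 * B).
    """
    return C * beat_freq_hz * sweep_time_s / (2 * bandwidth_hz)

# Illustrative numbers: a 1 GHz sweep over 10 microseconds. A target at
# 1.5 m delays the echo by ~10 ns, giving a beat of roughly 1 MHz.
```

Because range is read out as a frequency rather than a raw time-of-flight, the measurement rides on the laser's own modulation pattern, which is what lets the detector reject sunlight and other sources that don't carry that pattern.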

The researchers also replaced the rotating mirrors that standard LiDAR systems use to scan the laser across a scene with a diffraction grating, which acts like a prism, breaking the laser into a rainbow of frequencies that spread out as they travel. Because the laser is still sweeping through those frequencies, the beam scans far faster than mirror-based systems allow. The result is an FMCW LiDAR system with submillimeter localization accuracy and data throughput 25x greater than traditional LiDAR. In fact, the system is fast and accurate enough to capture the details of moving human body parts, such as a nodding head or a clenching fist, in real time.
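The mirror-free steering works because a grating's output angle depends on wavelength, so sweeping the laser's frequency sweeps the beam direction with no moving parts. The sketch below applies the standard grating equation at normal incidence; the line spacing and wavelengths are made-up illustrative values, not the researchers' design.

```python
import math

def grating_angle_deg(wavelength_m: float, line_spacing_m: float, order: int = 1) -> float:
    """Diffraction angle from the grating equation d*sin(theta) = m*lambda
    (normal incidence). As the laser sweeps wavelength, theta sweeps too,
    steering the beam without any rotating mirror."""
    s = order * wavelength_m / line_spacing_m
    if abs(s) > 1:
        raise ValueError("no propagating diffraction order at this wavelength")
    return math.degrees(math.asin(s))

# Illustrative: on a grating with 2 micron line spacing, sweeping a telecom-band
# laser from 1550 nm to 1560 nm shifts the first-order beam by about half a degree.
```

The steering speed is then set by how fast the laser can sweep its frequency, which is orders of magnitude faster than spinning a mirror.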
