A Different Way of Looking at the World

The Tianmouc vision chip was inspired by the human visual system and can process 10,000 frames per second, yet is highly energy-efficient.

Nick Bild
2 months ago • Sensors
Tianmouc is a novel vision chip inspired by the brain (📷: Tsinghua University)

For self-driving vehicles, autonomous drones, and intelligent robots to accomplish their goals, they first need to capture detailed and accurate information about the world around them. One of the best ways to do that is through the use of image sensors. These sensors provide a wealth of information that can be leveraged for navigation, obstacle avoidance, and just about anything else that a particular application calls for.

There is a downside to the rich data that cameras provide, however. A high-resolution image sensor captures millions of data points, perhaps 60 times per second. Processing this data quickly enough to adapt to rapidly changing conditions is very challenging and can drive up the cost, size, and power consumption of the supporting computing equipment. This processing burden also limits the number of image frames that can be handled per second, which further hinders efforts to respond to situations quickly. Furthermore, traditional image sensors have a limited dynamic range, which might blind them when, for example, entering a tunnel or facing oncoming headlights at night.
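To put that data volume in perspective, a quick back-of-the-envelope calculation helps (the 1080p RGB sensor and 60 fps figures here are illustrative assumptions, not specifications from the article):

```python
# Illustrative raw data rate of a conventional image sensor,
# assuming a 1080p sensor with 8-bit RGB output at 60 fps.
width, height = 1920, 1080      # pixels
bytes_per_pixel = 3             # 8-bit R, G, and B channels
fps = 60                        # frames per second

pixels_per_frame = width * height
bytes_per_second = pixels_per_frame * bytes_per_pixel * fps

print(f"{pixels_per_frame:,} pixels per frame")       # 2,073,600
print(f"{bytes_per_second / 1e6:.0f} MB per second")  # 373 MB/s
```

Roughly 373 MB of raw pixel data every second is a heavy load for any downstream computer, which is why frame rates and response times suffer.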

At their core, these problems all stem from the fact that image sensors operate almost nothing like the human eye. Taking steps to reproduce the way that the eye works could significantly improve present computer vision systems. That is exactly what a team of researchers at Tsinghua University is attempting to do with their specialized vision chip called Tianmouc. They have developed an entirely new sensing paradigm that greatly increases frame rates and dynamic range, while simultaneously reducing power consumption.

Inspired by the brain, Tianmouc first parses visual information into simpler, primitive-based representations. These primitives are then combined to give the system a complete understanding of its surroundings, without requiring the dense, pixel-level data produced by traditional image sensors. Evaluations demonstrated that Tianmouc can achieve frame rates of up to 10,000 frames per second, even at 10-bit precision. It can also cope with rapidly changing light levels, thanks to a very large dynamic range of 130 decibels.
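The idea of primitive-based representations can be sketched in software, though this is only a loose analogy for the chip's actual circuitry (the frame-differencing and gradient operations below are illustrative assumptions, not the team's published design):

```python
import numpy as np

# Software analogy of decomposing a scene into simple primitives:
# a dense intensity frame plus sparse spatial- and temporal-difference
# maps, rather than re-reading every pixel at the full frame rate.
rng = np.random.default_rng(0)
prev_frame = rng.random((4, 4))   # toy 4x4 grayscale frames
curr_frame = rng.random((4, 4))

# Temporal-difference primitive: what changed between frames.
temporal_diff = curr_frame - prev_frame

# Spatial-difference primitives: horizontal and vertical gradients.
spatial_dx = np.diff(curr_frame, axis=1)
spatial_dy = np.diff(curr_frame, axis=0)

# A downstream system would fuse these sparse primitives with an
# occasional dense frame to reconstruct its surroundings.
print(temporal_diff.shape, spatial_dx.shape, spatial_dy.shape)

# For scale: a 130 dB dynamic range corresponds to an intensity
# ratio of 10 ** (130 / 20), roughly 3.2 million to one.
print(f"{10 ** (130 / 20):.2e}")
```

Difference maps like these are mostly zero in a static scene, which hints at how such a scheme can sustain very high frame rates without a proportional increase in data volume.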

The team demonstrated the capabilities of their chip in an autonomous driving system, where it proved to be accurate, fast, and robust. These qualities enabled it to handle many of the edge cases that trip up existing autonomous navigation systems. As such, Tianmouc-based vision systems could lead to faster-responding and safer self-driving vehicles in the future.
