Sensory Overload

Researchers created the first artificial, multisensory integrated neuron, which combines visual and tactile stimuli for better AI systems.

nickbild · over 2 years ago · Sensors
This chip integrates tactile and visual sensing for smarter AI systems (📷: Tyler Henderson / Penn State)

The human senses work together synergistically to provide us with a comprehensive and rich understanding of the world around us. Sight, hearing, touch, taste, and smell each offer a unique perspective on our environment. But it is the integration of this sensory information that allows us to navigate the world so effectively, make sense of our surroundings, and make decisions based on a multifaceted understanding.

Consider a pianist, whose artistry relies on the seamless collaboration of multiple senses. As their nimble fingers caress the keys, the sense of touch provides vital feedback, allowing for precise control over nuances of tempo and expression. Simultaneously, their sense of hearing captures the resonating notes, allowing for immediate adjustments to pitch and volume. In the midst of this sensory symphony, the sense of sight plays a critical role, as the musician translates visual notations on sheet music into graceful and coordinated finger movements. This elegant fusion of touch, hearing, and sight is an example of the extraordinary synergy of our sensory system.

In contrast, electronic sensors used by artificial intelligence systems often operate independently of one another. These sensors, such as cameras, microphones, and various environmental sensors, collect data in isolation. While this can provide a wealth of information, it lacks the synergy seen in human senses. For example, a security camera can capture visuals, but it cannot directly incorporate audio data or tactile feedback. This disjointed approach can limit the system's ability to make nuanced decisions or provide a holistic understanding of complex situations.

But a team of researchers at Penn State University believes that the future will be filled with intelligent machines that perceive the world in a more human-like way. Toward that goal, they have created the first artificial, multisensory integrated neuron. Their chip combines visual and tactile sensing in a way that allows the input from each sensor to modify the behavior of the other.

An overview of the technology (📷: M. Sadaf et al.)

The researchers’ device is composed of a photosensitive molybdenum disulfide memtransistor layer and a triboelectric tactile sensor. The memtransistor can sense light that is shone on it, and importantly, it also has a memory. Much like a person in a dark room where the light switch is flipped on for just a moment, the memtransistor can remember what it saw for a time. The tactile sensor consists of two layers; as they slide past one another, friction generates an electrical signal that represents the sensation of touch.
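The memtransistor's fading memory can be pictured with a simple toy model: a light pulse raises an internal conductance state, which then decays over time. This is a minimal sketch for intuition only; the class name, time constant, and units are illustrative assumptions, not values from the paper.

```python
import math

class MemtransistorModel:
    """Toy model of a photosensitive memtransistor: a light pulse raises
    the conductance state, which then decays exponentially, a fading
    'memory' of what was seen. Parameters are illustrative, not measured."""

    def __init__(self, decay_tau=5.0):
        self.state = 0.0       # normalized conductance, 0..1
        self.decay_tau = decay_tau

    def illuminate(self, intensity):
        # A light pulse pushes the stored state toward its maximum.
        self.state = min(1.0, self.state + intensity)

    def step(self, dt):
        # Between pulses, the stored state decays exponentially with time.
        self.state *= math.exp(-dt / self.decay_tau)
        return self.state

m = MemtransistorModel()
m.illuminate(0.8)           # brief flash of light
after_1s = m.step(1.0)      # memory persists shortly after the flash...
after_10s = m.step(9.0)     # ...but fades as time passes
```

Like the person who glimpsed the dark room with the lights briefly on, the modeled device still "remembers" the flash a second later, but the trace has largely faded ten seconds on.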

A specialized circuit was developed to generate digital spiking events from the combination of the visual and tactile information. The probability of this circuit spiking was determined by the strength of the sensory inputs. Interestingly, the team found that when both the visual and tactile input signals were very weak, the spiking output of the chip was large. They noted that this correlates well with biological responses to stimulation, where multiple sensory cues are combined when individual signals are weak.
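That weak-signal effect, known in neuroscience as inverse effectiveness, falls out of even a simple thresholded spiking model: when each cue alone sits below threshold, combining them yields a disproportionately large jump in spiking probability. The sketch below assumes a sigmoidal spike probability with made-up threshold and gain values; it illustrates the principle, not the actual circuit in the paper.

```python
import math

def spike_probability(drive, threshold=1.0, gain=6.0):
    """Sigmoidal probability of a spike for a given total input drive.
    Threshold and gain are illustrative assumptions, not measured values."""
    return 1.0 / (1.0 + math.exp(-gain * (drive - threshold)))

def enhancement(visual, tactile):
    """Multisensory enhancement: combined spiking probability relative to
    the sum of the two unisensory probabilities. Values above 1 mean the
    pairing is super-additive."""
    combined = spike_probability(visual + tactile)
    separate = spike_probability(visual) + spike_probability(tactile)
    return combined / separate

weak = enhancement(0.4, 0.4)    # faint light + light touch
strong = enhancement(1.2, 1.2)  # bright light + firm touch
# Weak cues benefit disproportionately from being combined, while strong
# cues, each already near-certain to spike on their own, gain little.
```

Here `weak` comes out well above 1 (super-additive) while `strong` falls below 1, mirroring the biological pattern the team observed.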

But why all the hassle of creating special hardware like this? Why not just have independent sensors and let the machine learning model figure out the complex relationships that exist between them? That is the dominant approach today, but the researchers explained that it is not the most efficient path forward. For robots, drones, and self-driving vehicles to be widely adopted, we will need a more efficient and eco-friendly approach, like the artificial, multisensory integrated neuron proposed here.

So far, the team has only focused on integrating vision with touch, but moving forward, they plan to develop new chips that integrate even more senses. The hope is that this will allow for the development of far more intelligent and efficient systems in the future.
