EyeSyn Simulates Human Eye Movements to Boost the Metaverse

The virtual platform simulates how humans look at the world, allowing companies to better train AR/VR programs.

Cabe Atwell
Wearables / Sensors
Overview of EyeSyn. (📷: Guohao Lan et al.)

Computer engineers at Duke University and TU Delft have developed virtual eyes that simulate how humans look at the world, accurate enough for companies to use in training virtual and augmented reality applications. Known as EyeSyn, the virtual platform replicates how human eyes track stimuli, which developers can utilize to build the metaverse. Those stimuli can be anything — engaging in conversations, viewing paintings in art galleries, or purchasing products online.

“If you’re interested in detecting whether a person is reading a comic book or advanced literature by looking at their eyes alone, you can do that,” states Maria Gorlatova, professor of electrical and computer engineering at Duke. “But training that kind of algorithm requires data from hundreds of people wearing headsets for hours at a time. We wanted to develop software that not only reduces the privacy concerns that come with gathering this sort of data but also allows smaller companies who don’t have those levels of resources to get into the metaverse game.”

Instead of employing human eyes, the team created a virtual set by feeding the system templates for typical eye movement patterns, such as reading texts, watching videos and engaging people in conversation. EyeSyn then learns to recognize and match those patterns and uses that data to guess what humans are doing or seeing. According to the engineers, the process removes some of the privacy concerns associated with capturing large amounts of biometric data for training algorithms.
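To make the idea concrete, here is a minimal, hypothetical sketch of template-based gaze classification in the spirit the article describes. This is not EyeSyn's actual code: the feature (average horizontal and vertical gaze step size), the template values, and the activity labels are all invented for illustration.

```python
import math

def features(trace):
    """Mean absolute horizontal and vertical step sizes of a gaze trace.

    A trace is a list of (x, y) gaze positions in normalized screen
    coordinates. Reading tends to produce mostly horizontal motion;
    free viewing roams both axes.
    """
    n = len(trace) - 1
    dx = sum(abs(b[0] - a[0]) for a, b in zip(trace, trace[1:]))
    dy = sum(abs(b[1] - a[1]) for a, b in zip(trace, trace[1:]))
    return (dx / n, dy / n)

# Toy activity templates (invented numbers): expected feature values
# for each activity, standing in for the eye-movement templates the
# system would be seeded with.
TEMPLATES = {
    "reading": (0.15, 0.00),
    "watching_video": (0.05, 0.05),
}

def classify(trace):
    """Label a trace with the activity whose template is nearest."""
    f = features(trace)
    return min(TEMPLATES, key=lambda k: math.dist(f, TEMPLATES[k]))

# A synthetic "reading" trace: left-to-right sweeps with small
# downward line breaks every ten samples.
reading_trace = [((0.1 * i) % 1.0, 0.02 * (i // 10)) for i in range(40)]
print(classify(reading_trace))  # → reading
```

A real system would use far richer features (fixation durations, saccade statistics) and learned models rather than hand-set templates, but the structure — match an observed gaze pattern against per-activity references — is the same.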

Since the EyeSyn system relies on templates instead of large, cloud-based datasets filled with human eye movements, intrusion into privacy is minimal. This also makes the platform less resource-intensive, putting it within reach of smaller developers, who can render virtual environments without large amounts of computing power.
