Future VR Interaction Could Be Powered by Facial Recognition, as Researchers Demo Smile-to-Move Tech

Smile-to-move and clench-to-activate demonstrated as immersive interaction elements in simple VR scenarios.

Future virtual reality interactions could be controlled with your face rather than your hands, following research into tracking users' facial expressions and using them to drive their actions in VR.

"A smile was used to trigger the 'move' command; a frown for the 'stop' command; and a clench for the 'action' command, in place of a handheld controller performing these actions," explains Mark Billinghurst, professor at the University of South Australia, of the project.

"Essentially we are capturing common facial expressions such as anger, happiness, and surprise and implementing them in a virtual reality environment."

The input system used in the experiment doesn't track the user's expressions directly through a camera, but instead infers them from an electroencephalography (EEG) headset, tracking the brain activity associated with each of the three trigger expressions.
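To make the control scheme concrete, below is a minimal Python sketch of the expression-to-command mapping described above. It is illustrative only: the `Expression` labels, the `classify_expression` stub, and the command names are assumptions standing in for the team's actual EEG classification pipeline, which isn't detailed here.

```python
from enum import Enum

class Expression(Enum):
    SMILE = "smile"
    FROWN = "frown"
    CLENCH = "clench"
    NEUTRAL = "neutral"

# The mapping described in the article: smile -> move,
# frown -> stop, clench -> action.
COMMAND_MAP = {
    Expression.SMILE: "move",
    Expression.FROWN: "stop",
    Expression.CLENCH: "action",
}

def classify_expression(eeg_window: list[float]) -> Expression:
    """Stub for the EEG step: in the study, expressions are inferred
    from brain activity rather than seen by a camera. A real system
    would extract features from the EEG window and run a trained
    classifier; here we simply return NEUTRAL as a placeholder."""
    return Expression.NEUTRAL

def dispatch(expression: Expression) -> str | None:
    """Translate a detected expression into a VR command,
    ignoring expressions with no mapping (e.g. neutral)."""
    return COMMAND_MAP.get(expression)

if __name__ == "__main__":
    # Simulated stream of detections standing in for live classifier output.
    for detected in [Expression.SMILE, Expression.NEUTRAL,
                     Expression.CLENCH, Expression.FROWN]:
        command = dispatch(detected)
        if command is not None:
            print(f"{detected.value} -> issue '{command}' command")
```

In a live system, the simulated stream would be replaced by calls to the classifier as EEG windows arrive; the lookup-table dispatch keeps the expression-detection and game-command layers cleanly separated.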

The team tested the facial-control system in three scenarios: a "happy" scenario, in which the user was asked to catch butterflies; a "neutral" scenario of picking up objects in a workshop; and a "scary" scenario in which they had to shoot zombies. In all three scenarios, the trigger facial expressions were kept the same.

"Overall, we noticed an effect of interaction methods on the gamma activities in the brain and on skin conductance," the team concludes. "For some aspects of presence, facial expression outperformed controllers but controllers were found to be better than facial expressions in terms of usability."

"We expected the handheld controllers to perform better as they are a more intuitive method than facial expressions," says Billinghurst. "However, people reported feeling more immersed in the VR experiences controlled by facial expressions. Hopefully with some more research we can make it more user friendly."

The team is looking to expand its work to allow those unable to use a traditional controller to interact hands-free in virtual reality, and to layer facial-expression triggers on top of traditional handheld controllers, rather than replacing them, where triggering actions with an expression would make more sense.

The team's work has been published under closed-access terms in the International Journal of Human-Computer Studies.

Gareth Halfacree
Freelance journalist, technical author, hacker, tinkerer, erstwhile sysadmin. For hire: freelance@halfacree.co.uk.