Researchers Aim to Boost VR, AR Accessibility — By Letting Users Navigate with Their Faces
InterFACE picks seven "Facial Action Units" to control games and even web browsing, hands-free.
Researchers from the University of Glasgow and the University of St. Gallen have come up with a way to make virtual and augmented reality (VR and AR) more accessible: reading the user's facial expressions to deliver hands-free control, using an off-the-shelf VR headset.
"Some have suggested that VR is an inherently ableist technology because it often requires users to perform dextrous hand motions with controllers or make full-body movements which not everyone can perform comfortably. That means that, at the moment, there are barriers preventing widespread access to VR and AR experiences," co-author Graham Wilson explains. "With this study, we were keen to explore whether the functions of a commercially-available VR headset could be adapted to help users accurately control software using only their face. The results are encouraging, and could help point to new ways of using technology not only for people living with disabilities but more widely too."
The team's work focused on a commercial off-the-shelf headset, the Meta Quest Pro, and did not modify the hardware in any way. Instead, it relied on the headset's existing on-board cameras to monitor the wearer's facial expressions, a feature Meta already offers, with a list of 53 recognized expressions, as a way of increasing immersion in multiplayer environments. The team, though, set out to identify which of these expressions would be comfortable for repeated use, then built a custom neural network to read them with 97 per cent accuracy.
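The paper's architecture isn't detailed here, but the pipeline described, reading the headset's per-expression weights and classifying them into a small set of action units, can be sketched roughly as below. This is a minimal illustration rather than the authors' code: the 53-dimensional input, the seven class labels, and the network shape are assumptions made for the example.

```python
# Hypothetical sketch: classify one frame of facial-expression weights
# (as exposed by a Quest Pro-style face-tracking API) into one of seven
# "Facial Action Units". Not the authors' implementation.
import torch
import torch.nn as nn

NUM_EXPRESSION_WEIGHTS = 53   # assumed input size: one weight per tracked expression
NUM_ACTION_UNITS = 7          # the seven comfortable action units from the study


class ActionUnitClassifier(nn.Module):
    """Small MLP mapping expression weights to action-unit logits."""

    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(NUM_EXPRESSION_WEIGHTS, 64),
            nn.ReLU(),
            nn.Linear(64, 32),
            nn.ReLU(),
            nn.Linear(32, NUM_ACTION_UNITS),
        )

    def forward(self, weights: torch.Tensor) -> torch.Tensor:
        return self.net(weights)


model = ActionUnitClassifier()

# One frame of face-tracking output: 53 weights in [0, 1].
frame = torch.rand(1, NUM_EXPRESSION_WEIGHTS)
logits = model(frame)
predicted_unit = logits.argmax(dim=-1).item()
print(f"Predicted action unit index: {predicted_unit}")
```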
Once the researchers had identified seven "Facial Action Units" that were comfortable for repeated use and trained the network to sufficient accuracy, participants were asked to use the expressions for navigation: controlling a first-person shooter, including aiming, selecting options, and firing weapons; and navigating a web page through an automated environment. While the participants, none of whom were disabled, rated the experience as less precise than using handheld controllers, they also reported that the facial control system worked well and did not require excessive effort.
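The article doesn't describe how per-frame predictions become game or browser input, but one plausible scheme is to debounce the classifier's output and emit a discrete event once a prediction has been stable for a few frames. The unit-to-action mapping and the dwell threshold below are invented for illustration, not the study's design.

```python
# Hypothetical sketch: turn per-frame action-unit predictions into discrete
# input events (aim, fire, select, scroll). Mapping and threshold are assumptions.
from collections import deque
from typing import Optional

ACTION_MAP = {0: "aim_left", 1: "aim_right", 2: "fire",
              3: "select", 4: "scroll_up", 5: "scroll_down", 6: "back"}
DWELL_FRAMES = 5  # require a stable prediction for several frames to avoid jitter


class InputMapper:
    def __init__(self):
        self.history = deque(maxlen=DWELL_FRAMES)

    def update(self, predicted_unit: int) -> Optional[str]:
        """Return an action name once a prediction has been stable long enough."""
        self.history.append(predicted_unit)
        if len(self.history) == DWELL_FRAMES and len(set(self.history)) == 1:
            self.history.clear()
            return ACTION_MAP.get(predicted_unit)
        return None


mapper = InputMapper()
for frame_prediction in [2, 2, 2, 2, 2, 3, 1]:
    event = mapper.update(frame_prediction)
    if event:
        print(f"Emit input event: {event}")  # prints "fire" after five stable frames
```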
"This is a relatively small study, based on data captured with the help of non-disabled people. However, it shows clearly that these seven specific facial movements are likely the most easily-recognized by off-the-shelf hardware, and translate well to two of the typical ways we might expect to use more traditional VR and AR input methods," claims co-author Mark McGill. "That gives us a base to build on in future research. We plan to work with people with disabilities like motor impairments or muscular disorders in further studies to provide developers and XR platforms with new suggestions of how they can expand their palette of accessibile."
The team is to present its work, dubbed InterFACE, at the CHI Conference 2025 on Monday, April 28th, with a preprint available on the University of St. Gallen website now; the researchers have also promised to make the training dataset available for others to explore.