Keep Your AIs on the Road

Research done by NVIDIA has revealed a set of hand gestures that are optimized for safety, ease of use, and robustness in automotive UIs.

Nick Bild
2 years ago · Machine Learning & AI
Collecting data in a driving simulator (📷: S. Gupta et al.)

For as long as there have been automobiles, there has been a standard set of generally accepted hand gestures in use by motorists. In the past, these gestures were used only to express dissatisfaction with another driver’s skill behind the wheel, but today they can serve a much more important purpose. Traditional automobile interfaces that require touch-based inputs tend to take a driver’s eyes off the road. This is a serious problem: according to the NHTSA, driver distraction played a role in 10% of all crashes reported to police in the United States in 2013. Using a touchless, hand gesture-based interface instead (with the hands on, or close to, the steering wheel) can reduce driver distraction, which enhances safety.

This gesture recognition capability has been made possible by advances in both machine learning algorithms and edge computing hardware. But not all hand gestures are created equal: even state-of-the-art computer vision algorithms have difficulty recognizing some of them. Driving down the highway at 70 miles per hour is not the time or place to debug a problem with your car, so these gesture-based interfaces need to be highly accurate, and consistently so. Otherwise, they would only add to distraction and dangerous driving practices.

A number of studies have assessed user acceptance of hand gesture-based automobile controls. Drivers generally prefer these interfaces over traditional options, but they also find them less reliable. That lack of reliability has greatly limited the deployment of gesture recognition systems in vehicles on the market today. Fortunately, we may see some movement in this area in the near future, because the machine learning wizards at NVIDIA have taken on the challenge. As a first step, they evaluated a set of 25 hand gestures to determine which ones best balance safety, ease of use, and robustness.

Each of the 25 gestures was evaluated by five state-of-the-art gesture recognition algorithms and by six human annotators, with inputs for the algorithms captured by RGB cameras. The team found that a certain subset of gestures was consistently recognized more accurately by both the algorithms and the humans, suggesting that some gestures are simply inherently more recognizable than others.
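To make that comparison a bit more concrete, here is a minimal Python sketch (an illustration, not the team's actual pipeline) of how per-gesture accuracy could be aggregated across several raters, whether algorithms or human annotators, to find the gestures that everyone recognizes reliably. The function names, gesture labels, and 0.9 threshold are purely assumptions.

```python
from collections import defaultdict

def per_gesture_accuracy(predictions, labels):
    """Fraction of correct predictions for each gesture class,
    given parallel lists of predicted and true gesture names."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for pred, true in zip(predictions, labels):
        total[true] += 1
        correct[true] += int(pred == true)
    return {gesture: correct[gesture] / total[gesture] for gesture in total}

def consistently_recognized(results_per_rater, labels, threshold=0.9):
    """Gestures recognized above `threshold` by every rater (algorithm or human)."""
    per_rater = [per_gesture_accuracy(preds, labels) for preds in results_per_rater]
    return {
        gesture for gesture in set(labels)
        if all(acc.get(gesture, 0.0) >= threshold for acc in per_rater)
    }

# Toy example: two raters, three clips labeled with their true gesture.
labels = ["swipe_left", "swipe_left", "thumb_up"]
rater_a = ["swipe_left", "swipe_left", "thumb_up"]  # recognizes everything
rater_b = ["swipe_left", "thumb_up", "thumb_up"]    # confuses one clip
print(consistently_recognized([rater_a, rater_b], labels))  # {'thumb_up'}
```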

By examining which gestures were frequently confused with one another, the team was able to create a new set of 10 merged gestures. For example, rotating the fingers clockwise and rotating them counterclockwise were often confused with each other, so a new merged gesture was created that simply involves rotating the fingers in either direction. By folding similar gestures into one, misclassifications are avoided and system robustness increases significantly.
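As a rough sketch of how that merging might work (an assumption-laden illustration, not NVIDIA's code), one could scan a confusion matrix for gesture pairs that are mistaken for each other more often than some cutoff and fold them into a single merged class:

```python
def merge_confused_gestures(confusion, names, cutoff=0.15):
    """Group gestures that are mistaken for each other more often than `cutoff`.

    confusion[i][j] is the fraction of clips of gesture i classified as gesture j.
    """
    parent = list(range(len(names)))  # union-find: each gesture starts in its own group

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            # Merge if either direction of confusion exceeds the cutoff,
            # e.g. rotate-clockwise vs. rotate-counterclockwise.
            if confusion[i][j] > cutoff or confusion[j][i] > cutoff:
                parent[find(i)] = find(j)

    groups = {}
    for i, name in enumerate(names):
        groups.setdefault(find(i), []).append(name)
    return [" / ".join(members) for members in groups.values()]

# Toy confusion matrix for three gestures (each row sums to 1).
names = ["rotate_cw", "rotate_ccw", "swipe_up"]
confusion = [
    [0.70, 0.25, 0.05],
    [0.30, 0.65, 0.05],
    [0.02, 0.03, 0.95],
]
print(merge_confused_gestures(confusion, names))
# ['rotate_cw / rotate_ccw', 'swipe_up']
```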

The merged gestures also turned out to be more memorable for humans, not just easier to detect. Using recognition accuracy as a proxy for memorability, the team measured 96.62% for the merged set versus 92.73% for the initial set of 25.

In this first-of-its-kind work, the team at NVIDIA has determined a set of optimized hand gestures that could make driving safer and finally lead to widespread adoption of a technology drivers have already expressed interest in. The work is not finished yet, however. Next, they plan to incorporate more human factors, like driver distraction and ease of use, into their evaluation criteria. They also intend to use an even larger set of gesture data to increase confidence in the results, and perhaps find new gestures that would make automotive user interfaces better still.
