IMU Used to Improve Touch Accuracy and Latency in Mixed Reality Environments

This system combines a finger-mounted IMU with optical input from an HMD to improve the accuracy and latency of surface-tapping input.

Jeremy Cook

Head-mounted mixed reality (MR) systems have the inherent ability to sense touch on any surface, which can turn nearly any object into a "touchscreen" by registering contact whenever a tracked finger comes within 10mm of that surface. While useful, a fingertip hovering within 10mm hasn't necessarily touched anything, so purely optical sensing is unreliable.
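To see why that optical-only approach is fragile, here is a minimal sketch of the proximity test it amounts to. The 10mm threshold comes from the article; the function name and data shapes are illustrative assumptions, not the researchers' code:

```python
# Sketch of optical-only touch detection (illustrative, not the paper's code).
TOUCH_THRESHOLD_MM = 10.0  # proximity cutoff cited in the article

def optical_touch(fingertip_height_mm: float) -> bool:
    """Register a 'touch' whenever the tracked fingertip is within 10 mm
    of the surface plane. A finger that hovers close but never makes
    contact still passes this test, which is where the false positives
    come from."""
    return fingertip_height_mm <= TOUCH_THRESHOLD_MM
```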

Researchers at Tsinghua University in Beijing, however, have been exploring a way to combine accelerometer sensing with that visual data to improve accuracy, from a rather lackluster 84.74% using optical sensing alone to a much improved 98.61% with the hybrid approach. Latency is also excellent, with taps detected in as little as 10ms.
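The article doesn't spell out the exact fusion method, but the basic idea can be sketched as gating the optical proximity test with the sharp deceleration spike a finger-mounted accelerometer sees when the fingertip strikes a rigid surface. In the sketch below, the 10mm gate is from the article, while the spike threshold, sample rate, and window size are illustrative guesses:

```python
import numpy as np

TOUCH_THRESHOLD_MM = 10.0   # optical proximity gate (from the article)
SPIKE_THRESHOLD_G = 2.0     # assumed accel-magnitude threshold, not from the paper
WINDOW_SAMPLES = 10         # ~10 ms of IMU samples at an assumed 1 kHz rate

def hybrid_touch(fingertip_height_mm: float, accel_window_g: np.ndarray) -> bool:
    """Report a tap only when the fingertip is optically close to the surface
    AND the ring-mounted IMU shows an impact spike in the same short window.

    accel_window_g: (WINDOW_SAMPLES, 3) array of recent accelerometer
    readings in g, one row per sample.
    """
    near_surface = fingertip_height_mm <= TOUCH_THRESHOLD_MM
    # Magnitude of the 3-axis acceleration over the window around "now".
    magnitudes = np.linalg.norm(accel_window_g, axis=1)
    impact = magnitudes.max() >= SPIKE_THRESHOLD_G
    return near_surface and impact
```

The impact spike is what separates a genuine tap from a finger that merely hovers inside the 10mm band, which is how the hybrid system removes most of the optical false positives.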

In their experiments, the researchers first investigated ring placement, finding the proximal phalanx, or "root," of the index finger to be the optimal position for sensing. Participants then wore a nine-axis IMU ring wired to an Arduino Uno and tapped on a wooden board covered with conductive ink while wearing a Leap Motion optical sensor.

Data from the ring, the Leap Motion device, and the board's capacitance readings were recorded and analyzed to evaluate the setup's performance. While results are promising so far, the project's paper notes that, given the amount of accelerometer data available, machine learning techniques could improve accuracy even further.
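That suggestion might look something like the following sketch, which uses the conductive-ink board's capacitance readings as ground-truth labels for training a simple classifier on windows of IMU data. The feature choices and classifier here are assumptions for illustration, not anything described in the paper:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def imu_features(window_g: np.ndarray) -> np.ndarray:
    """Hand-picked features from one (N, 3) window of accelerometer data:
    per-axis mean and standard deviation, plus the peak combined magnitude."""
    mags = np.linalg.norm(window_g, axis=1)
    return np.concatenate([window_g.mean(axis=0),
                           window_g.std(axis=0),
                           [mags.max()]])

def train_touch_classifier(windows: list[np.ndarray], capacitive_labels: np.ndarray):
    """Fit a touch / no-touch classifier on IMU windows, using the capacitance
    readings from the conductive-ink board as ground truth (1 = contact)."""
    X = np.stack([imu_features(w) for w in windows])
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X, capacitive_labels)
    return clf
```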
