Researchers Develop Pipeline Software to Provide More Touchscreen Interactivity

This system allows conventional 2D capacitive touchscreens to infer a user’s 3D hand pose.

The new software recognizes a human hand's pose, allowing for richer interaction with the touchscreen. (📷: Future Interfaces Group)

Touchscreen devices capture the XY positions of fingertips on a screen, but existing touchscreens cannot recognize the full 3D pose of a human hand. Researchers at Carnegie Mellon's Future Interfaces Group recently developed open source pipeline software that enables conventional capacitive touchscreens to infer 3D hand poses, making on-screen interaction more expressive. The software supports a variety of hand poses and tracks hand position and rotation.

The researchers implemented their software on a Samsung Galaxy Tab S2 tablet, replacing the touch controller driver to obtain the raw capacitive image, which behaves much like the output of a very short-range depth camera. The team then cropped and normalized this input and compared it against a library of reference hand poses by computing the displacement field between each reference pose and the real-world input. The total displacement serves as a distance metric: close matches require little displacement, while poor matches require much more.
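To make the matching step concrete, here is a minimal sketch of displacement-field pose matching. It uses Farneback dense optical flow as a stand-in displacement estimator and the summed flow magnitude as the distance metric; the preprocessing and the `references` dictionary are illustrative assumptions, not the authors' actual implementation (their open source release contains the real pipeline).

```python
import numpy as np
import cv2


def preprocess(capacitive_frame: np.ndarray) -> np.ndarray:
    """Normalize a raw capacitive frame to an 8-bit grayscale image."""
    img = cv2.normalize(capacitive_frame.astype(np.float32), None,
                        0, 255, cv2.NORM_MINMAX)
    return img.astype(np.uint8)


def pose_distance(ref_img: np.ndarray, live_img: np.ndarray) -> float:
    """Distance between a reference pose image and the live frame.

    Computes a dense displacement field (here via Farneback optical flow,
    one possible stand-in for the paper's matching method) from reference
    to live image and sums its magnitude: a close match needs little
    displacement, a poor match needs a lot.
    """
    flow = cv2.calcOpticalFlowFarneback(
        ref_img, live_img, None, 0.5, 3, 15, 3, 5, 1.2, 0)
    return float(np.linalg.norm(flow, axis=2).sum())


def best_reference_pose(references: dict, capacitive_frame: np.ndarray) -> str:
    """Return the name of the library pose with the lowest displacement cost."""
    live = preprocess(capacitive_frame)
    return min(references, key=lambda name: pose_distance(references[name], live))
```

In a sketch like this, the winning pose's flow field would be kept around rather than discarded, since the refinement step described next reuses its per-fingertip displacement vectors.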

After selecting the best-matching pose, the software loads the corresponding 3D hand model. Because a library pose rarely matches the user's exact finger positions, the displacement field is used to refine the model: the hand model's end effectors (the fingertips) are anchored to the corresponding vectors in the displacement field, and inverse kinematics then solves for the final, more realistic hand pose. The researchers note that their approach requires no new sensor hardware and could be deployed to existing devices through software updates.
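The anchoring step can be sketched with a simple inverse kinematics solver. The fragment below uses cyclic coordinate descent (CCD) on a 2D finger chain; CCD is a common IK method chosen here for brevity, not necessarily the solver the researchers used, and the joint chain and displaced fingertip target are hypothetical.

```python
import numpy as np


def ccd_ik(joint_positions: np.ndarray, target: np.ndarray,
           iterations: int = 20, tol: float = 1e-3) -> np.ndarray:
    """Cyclic coordinate descent IK on a 2D joint chain (illustrative only).

    joint_positions: (N, 2) array from finger base to fingertip.
    target: (2,) desired fingertip position, e.g. a library-pose fingertip
    shifted by its displacement-field vector.
    """
    joints = joint_positions.astype(float).copy()
    for _ in range(iterations):
        # Sweep from the joint nearest the tip back toward the base.
        for i in range(len(joints) - 2, -1, -1):
            to_tip = joints[-1] - joints[i]
            to_target = target - joints[i]
            # Angle that rotates the current tip direction onto the target.
            a = (np.arctan2(to_target[1], to_target[0])
                 - np.arctan2(to_tip[1], to_tip[0]))
            c, s = np.cos(a), np.sin(a)
            rot = np.array([[c, -s], [s, c]])
            # Rotate the sub-chain beyond joint i about joint i.
            joints[i + 1:] = (joints[i + 1:] - joints[i]) @ rot.T + joints[i]
            if np.linalg.norm(joints[-1] - target) < tol:
                return joints
    return joints


# Hypothetical finger chain: knuckle -> middle joint -> fingertip.
finger = np.array([[0.0, 0.0], [1.0, 0.0], [1.8, 0.0]])
# Target = reference-pose fingertip shifted by its displacement vector.
refined = ccd_ik(finger, np.array([1.5, 0.6]))
```

Because CCD only rotates segments about the joints, segment lengths are preserved, which is what keeps the refined pose anatomically plausible rather than simply stretching the fingertip to the target.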

This new software pipeline lends itself to many applications. For example, in mobile AR, a user-controlled virtual hand could manipulate objects in the environment. Similarly, placing an invisible virtual hand mesh below the touchscreen lets the user interact with virtual objects, and with six degrees of freedom, that mesh hand can even be used to control a virtual brush.
