It Was a Nice Gesture, Anyway

JINSense repurposes a pair of smart glasses to detect midair gestures — but you'll have to take the glasses off first.

Nick Bild
5 years ago · AI & Machine Learning
JINSense (📷: H. Yeo et al.)

Sometimes you do not need shiny new hardware to build the device that you have dreamed up. You might find that it already exists if you get a little bit creative with how you use it. A multidisciplinary team centered at the University of St. Andrews found that to be the case when they wanted to develop a method for gesture sensing.

They started with a pair of JINS MEME smart eyeglasses, which are equipped with electrooculography (EOG) sensors embedded into the nose pad and nose bridge. These sensors are designed to track eye movements and blinking to support a healthy lifestyle. The EOG sensors were repurposed in this study to recognize midair gestures with a technique referred to as JINSense.

Our eyes naturally act as electric dipoles, with the cornea as the positive pole and the retina as the negative pole. When the eyes move, this dipole moves with them, shifting the electrical potential in a way that EOG sensors near the eyes can measure. When such sensors are not in direct contact with the skin, they instead measure disturbances in the electric potential of the surrounding air. It is this latter property that makes JINS MEME glasses both appropriate, and a bit awkward, as a platform for JINSense — you have to take the glasses off for gesture sensing to work. As long as the sensors are making skin contact, they are overwhelmed by unrelated eye-movement signals.

For gesture sensing, the glasses are removed and set on a surface such as a table. The team initially tried a simple thresholding approach to classify gestures, but the accuracy was unacceptably low. They next turned to the Python library scikit-learn to build a machine learning model, choosing a random decision forest for its speed and tiny memory footprint. After training the model, they experimented with it and found that it could robustly recognize left and right gestures. It could not tell the difference between up and down, but it could recognize them as a class distinct from left and right. They believe that push, pull, rotate, and wiggle gestures might also be accurately classified with further refinement of the machine learning methods.
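The classification step described above can be sketched in a few lines of scikit-learn. The team's actual features and recordings are not public here, so this is only an illustrative sketch: synthetic EOG-like windows stand in for real sensor data, and the class labels mirror the article's findings (left, right, and a combined up/down class).

```python
# Hedged sketch of a random-forest gesture classifier in scikit-learn.
# All signal shapes below are invented placeholders, not the study's data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def synthetic_window(label, n_samples=64):
    """Generate a fake EOG window; the shape per class is an assumption."""
    t = np.linspace(0, 1, n_samples)
    if label == 0:    # "left" gesture: positive bump
        sig = np.exp(-((t - 0.5) ** 2) / 0.02)
    elif label == 1:  # "right" gesture: negative bump
        sig = -np.exp(-((t - 0.5) ** 2) / 0.02)
    else:             # "vertical" (up/down indistinguishable): oscillation
        sig = 0.5 * np.sin(2 * np.pi * 4 * t)
    return sig + rng.normal(scale=0.1, size=n_samples)

# 100 windows per class; raw samples serve directly as features here.
y = np.repeat([0, 1, 2], 100)
X = np.array([synthetic_window(label) for label in y])

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

# A small forest keeps memory use low, matching the article's rationale.
clf = RandomForestClassifier(n_estimators=50, max_depth=8, random_state=0)
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```

On real EOG data the features would more likely be statistics over a sliding window (peak amplitude, zero crossings, and so on) rather than raw samples, but the fit/predict structure would be the same.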

While these gestures belong to broad categories and lack fine specificity, they are still sufficient for basic interactions with devices. There are also indications that JINSense may have applications in proximity sensing and material differentiation — the researchers were able to determine whether plastic bottles were filled with water, for example. JINSense is still at a very early prototype stage, and large-scale validation of the methods has not yet been conducted.

The team is currently exploring ways to recognize gestures while the glasses are being worn, which would go a long way toward making JINSense a practical device. In the meantime, they are looking for more compelling use cases for their sensing technology — without further refinement, JINSense will remain a solution searching for a problem.
