EmoSense Uses AI to Determine Your Emotional State Based Only on How You Affect Wireless Signals

Researchers have turned to artificial intelligence to build a system that can determine your emotional state using only wireless signals.

Humans are naturally social animals, and we have evolved to have an innate awareness of the emotions of people around us. But though you might easily be able to recognize that your coworker is feeling blue, it can be difficult to determine exactly how you know that. Is it their slightly slouched posture? Or maybe it’s that they’re less talkative than normal? Because we have a hard time identifying subtle emotional indicators, it’s a challenge to program a system to look for them. That’s why researchers have turned to artificial intelligence to build this system that can determine your emotional state using only wireless signals.

It might sound impossible for wireless signals to be affected by your emotional state, but this research team from universities in Japan and China says EmoSense is as reliable as other sensor- or computer vision-based emotion recognition techniques. The system works thanks to a few simple facts: your body and movements affect the wireless signals around you. The changes in those wireless signals can be tracked in order to recognize movements. And, most importantly, your body movements depend on your emotional state—even if you’re not consciously aware of it.

To take advantage of those facts, the researchers set up four antennas around the subject. One antenna transmits wireless signals, while the other three antennas receive them. A small computer analyzes the received signals and can detect the “shadow” of the subject. That shadow subtly changes as they move. Those movements are then analyzed by a machine learning model that has been trained on shadow data labeled with the emotions each movement pattern corresponds to. Like any other machine learning system, EmoSense can then make an inference based on the data it was trained on.
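The pipeline described above—extract movement features from the wireless “shadow,” then match them against labeled training examples—can be sketched in miniature. This is not the researchers’ actual model; the feature names, synthetic data, and nearest-centroid classifier below are all illustrative assumptions standing in for EmoSense’s real signal processing and machine learning stages.

```python
import math

# Hypothetical sketch of an EmoSense-style pipeline: guess an emotion
# from movement features extracted from a wireless "shadow."
# All feature values and labels here are synthetic placeholders.

# Imagined training features: (gesture speed, gesture sharpness),
# each in the range 0..1, derived from how a subject's body movements
# perturb the signals seen by the receiving antennas.
TRAINING_DATA = {
    "angry":   [(0.90, 0.80), (0.85, 0.90)],
    "sad":     [(0.20, 0.10), (0.25, 0.15)],
    "neutral": [(0.50, 0.40), (0.55, 0.45)],
}

def centroid(points):
    """Mean of a list of 2-D feature vectors."""
    n = len(points)
    return (sum(p[0] for p in points) / n,
            sum(p[1] for p in points) / n)

# One centroid per emotion, computed from the labeled training data.
CENTROIDS = {label: centroid(pts) for label, pts in TRAINING_DATA.items()}

def classify(features):
    """Nearest-centroid guess at the emotion behind a movement pattern."""
    return min(CENTROIDS,
               key=lambda label: math.dist(features, CENTROIDS[label]))

# Fast, sharp gestures land closest to the "angry" centroid.
print(classify((0.88, 0.85)))
```

A real system would replace the two hand-picked features with many signal-derived ones and the nearest-centroid rule with a trained model, but the shape of the inference step is the same: map a new shadow measurement to the closest pattern seen during training.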

In this case, it’s making an educated guess about the emotional state of the test subject. For example, an angry person might move with sharper gestures. Given the right training data, the system could infer that a person who is currently moving with jerky gestures may be angry. The hardware needed for EmoSense is less expensive than other systems, and still manages to be as reliable—at least according to the researchers. It also carries less potential for privacy violations. That said, psychology is a soft science, and body movements can vary wildly between individuals who are experiencing similar emotions. That limits the practicality of EmoSense, but it’s a challenge that could still be overcome.

Cameron Coward
Writer for Hackster News. Proud husband and dog dad. Maker and serial hobbyist.