Based on a Raspberry Pi, the bot drives up to you, detects how you’re feeling, and then tries to hold a conversation with you depending on your mood. Empathybot was brought to life using Google Cloud Vision, a GoPiGo, a Raspberry Pi camera, a button, and a speaker.
Here’s how it works: the ultrasonic sensor measures the distance from the GoPiGo to a human subject, and when it detects a person, the robot stops. Once it’s close enough for a good look, it greets the person and takes a photo with the Raspberry Pi camera.
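The approach-and-stop step might look something like the sketch below. This is not the project’s actual code: the 30 cm threshold, the function names, and the idea of feeding in a series of sensor readings are all assumptions for illustration (on real hardware the readings would come from the GoPiGo’s ultrasonic sensor).

```python
STOP_DISTANCE_CM = 30  # assumed threshold for "close enough for a good look"

def approach(readings, stop_distance_cm=STOP_DISTANCE_CM):
    """Drive toward a person, checking each ultrasonic reading (in cm).

    Returns the reading at which the robot stops, or None if no one
    ever comes within range. On the real robot, each loop iteration
    would also command the motors forward.
    """
    for distance in readings:
        if distance <= stop_distance_cm:
            # A person is close enough: stop and take the photo.
            return distance
    return None

# Simulated sensor readings as the robot closes in on someone:
approach([120, 80, 45, 25, 10])  # stops at the first reading within 30 cm
```

On the robot itself, the loop would poll the sensor continuously rather than iterate over a fixed list.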
From here, the robot sends the picture to Google Cloud Vision, which analyzes it and returns a JSON document describing the emotions it sees in the face: happiness, sadness, surprise, or anger. Depending on what the software detects, the robot will ask you why you’re so happy or sad, or, if you’re angry, remind you that it has a family.
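The JSON that Cloud Vision returns for a detected face includes per-emotion likelihood fields (`joyLikelihood`, `sorrowLikelihood`, `surpriseLikelihood`, `angerLikelihood`), each set to a value like `VERY_UNLIKELY` through `VERY_LIKELY`. A sketch of picking the dominant emotion from one face annotation could look like this; the field names follow the Vision API’s face-detection schema, but the ranking logic and emotion labels are our own guess at how Empathybot might decide:

```python
# Likelihood strings from the Vision API, ranked weakest to strongest.
LIKELIHOOD_RANK = {
    "VERY_UNLIKELY": 0,
    "UNLIKELY": 1,
    "POSSIBLE": 2,
    "LIKELY": 3,
    "VERY_LIKELY": 4,
}

# Map a friendly emotion name to its field in a faceAnnotation object.
EMOTION_FIELDS = {
    "happy": "joyLikelihood",
    "sad": "sorrowLikelihood",
    "surprised": "surpriseLikelihood",
    "angry": "angerLikelihood",
}

def dominant_emotion(face):
    """Return the emotion with the strongest likelihood, or None if
    every likelihood is absent or VERY_UNLIKELY."""
    best, best_rank = None, 0
    for emotion, field in EMOTION_FIELDS.items():
        rank = LIKELIHOOD_RANK.get(face.get(field, "VERY_UNLIKELY"), 0)
        if rank > best_rank:
            best, best_rank = emotion, rank
    return best

# Example faceAnnotation fragment, as parsed from the API's JSON:
face = {"joyLikelihood": "VERY_LIKELY", "angerLikelihood": "UNLIKELY"}
dominant_emotion(face)  # "happy"
```

In practice the robot would first parse the full response with `json.loads` and pull out the first entry of `faceAnnotations`.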
Dexter Industries also used the eSpeak speech synthesizer to let Empathybot talk back to you through the speaker. After the emotional interaction, the bot drives off to find another friend. The code is written in Python.
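eSpeak has a simple command-line interface, so one common pattern is to shell out to it from Python. The response lines below are invented for illustration (the article only tells us the robot asks why you’re happy or sad, and mentions its family if you’re angry); the `say` helper is a hypothetical wrapper, not the project’s code:

```python
import subprocess

# Hypothetical mapping of detected emotion to a spoken line.
RESPONSES = {
    "happy": "You look happy! Why are you so happy?",
    "sad": "You seem sad. Why are you so sad?",
    "surprised": "You look surprised! What happened?",
    "angry": "You seem angry. Please remember, I have a family.",
}

def say(text):
    """Speak a line through the speaker via the espeak CLI."""
    subprocess.call(["espeak", text])

def respond(emotion):
    """Speak the canned line for a detected emotion, if there is one."""
    line = RESPONSES.get(emotion)
    if line:
        say(line)
```

`subprocess.call(["espeak", text])` blocks until the line has been spoken, which is fine here since the robot waits for the interaction to finish before driving off.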
Sound pretty intriguing? You can find the team’s entire writeup and tutorial here.