Smart home devices like Amazon Echo or Google Home may be easy for most people to use. But for those with motor neuron diseases such as amyotrophic lateral sclerosis (ALS), also known as Lou Gehrig’s disease, producing the speech needed to command the system can present a seemingly insurmountable barrier.
However, there is now a workaround, as Jay Smith shows in the video below. Using Microsoft’s GazeSpeak app, he’s able to translate his eye movements into computer-generated speech. This speech is then picked up by Amazon’s Alexa to control his surroundings, such as which lights are on or what song is playing.
“I have ALS so I am mostly paralyzed and use my eyes to communicate,” Smith explains. “I have a smart home and now can use my eyes to type commands through my Microsoft Surface. It works with my Wemo light, fans, Ecobee thermostat, and more.”
In a related project, seen in the second video here, he even has a bed that can be controlled via his eyes. The system uses an Arduino-compatible RedBear Blend Micro to translate his eye movement into the needed infrared signals.
UPDATE: Most recently, Smith has created a system with an Arduino and custom software that enables him to maneuver his wheelchair using his eyes through a Microsoft Surface.