Researchers at the University of Electro-Communications in Tokyo (UEC Tokyo) and the University of Michigan have discovered a way to send voice commands to the MEMS microphones used in the majority of smart home voice assistants using silent laser light — making them inaudible to anyone else in the room.
Voice-activated computerised assistants have long been a staple of science fiction, but in recent years have started to appear in people's homes in the form of products from Amazon, Apple, Google, and others. Linked into a typically per-vendor ecosystem, these devices can be used for everything from playing music to controlling smart home accessories — including security systems and door locks.
It's these latter features that make the devices a tempting target for attack, and a team of researchers from the University of Electro-Communications in Tokyo (UEC Tokyo) and the University of Michigan has figured out a neat means of subtly controlling these assistants: shining silent laser light on the microphone, including through windows.
"By shining the laser through the window at microphones inside smart speakers, tablets, or phones, a far away attacker can remotely send inaudible and potentially invisible commands which are then acted upon by Alexa, Portal, Google Assistant, or Siri," the researchers explain. "Making things worse, once an attacker has gained control over a voice assistant, the attacker can use it to break other systems. For example, the attacker can: Control smart home switches; open smart garage doors; make online purchases; remotely unlock and start certain vehicles; open smart locks by stealthily brute forcing the user's PIN number."
The trick works owing to the use of microelectromechanical systems (MEMS) microphones, tiny vibration-sensitive components which react to sound — and which, the researchers found, also react to light. By modulating the intensity of the light beam, the researchers were able to make the microphone react as though an audible command had been received — but in total silence.
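The principle at work is straightforward amplitude modulation: the audio waveform of a command is mapped onto the brightness of the laser, so the microphone's diaphragm responds as if it were hearing sound. The sketch below illustrates the general idea under stated assumptions — the function and parameter names (`am_laser_drive`, `bias`, `depth`) are illustrative, not taken from the researchers' work, and a real attack would drive actual laser hardware rather than compute a list of numbers.

```python
import math

def am_laser_drive(audio, bias=0.5, depth=0.4):
    """Map an audio waveform in [-1, 1] onto a laser intensity
    envelope in [0, 1] via simple amplitude modulation.

    bias  -- constant (DC) brightness the laser idles at
    depth -- how strongly the audio swings the brightness
    """
    drive = []
    for sample in audio:
        level = bias + depth * sample
        drive.append(min(1.0, max(0.0, level)))  # clamp to a valid intensity
    return drive

# A 400 Hz tone sampled at 8 kHz stands in for a spoken command waveform.
rate = 8000
tone = [math.sin(2 * math.pi * 400 * n / rate) for n in range(rate // 10)]
drive = am_laser_drive(tone)
```

With `bias=0.5` and `depth=0.4`, the drive signal swings between 10% and 90% brightness, so the modulation never clips — the microphone sees a clean copy of the audio even though no sound is produced.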
It's a neat hack, but one which exposes a potentially serious security flaw. Speaking to Ars Technica, Google indicated that it was in contact with the researchers as it worked to understand more about the vulnerability; both Google and Amazon, meanwhile, indicated that the attacks aren't likely to affect users in the real world. Apple and Facebook did not offer comment.
More information on the exploit, which the researchers indicate can also be carried out at shorter ranges using non-laser light including simple torches, can be found on the Light Commands website or in the official research paper.