New Attack Exfiltrates Sensitive Data From Voice Assistants Using "Inaudible" Telephone Calls

By encoding data as DTMF tones and then modulating them to inaudible frequencies, an Alexa becomes an unwitting carrier for stolen data.

Gareth Halfacree

Researchers at Georgia Tech have published a paper detailing an attack that uses voice assistant systems, such as Amazon Alexa and Google Home, as "inaudible" data exfiltration mechanisms by encoding information in DTMF tones.

"New security and privacy concerns arise due to the growing popularity of voice assistant (VA) deployments in home and enterprise networks," the researchers explain in the paper's abstract. "A number of past research results have demonstrated how malicious actors can use hidden commands to get VAs to perform certain operations even when a person may be in their vicinity. However, such work has not explored how compromised computers that are close to VAs can leverage the phone channel to exfiltrate data with the help of VAs."

"After characterizing the communication channel that is set up by commanding a VA to make a call to a phone number, we demonstrate how malware can encode data into audio and send it via the phone channel. Such an attack, which can be crafted remotely, at scale and at low cost, can be used to bypass network defenses that may be deployed against leakage of sensitive data. We use Dual-Tone Multi-Frequency tones to encode arbitrary binary data into audio that can be played over computer speakers and sent through a VA mediated phone channel to a remote system."
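The core encoding step the researchers describe is straightforward: DTMF defines sixteen symbols, each a pair of sine tones, so one symbol can carry four bits and one byte maps to two symbols. The sketch below is a minimal, stdlib-only illustration of that idea; the nibble-to-symbol mapping, symbol duration, and sample rate are illustrative choices, not the paper's actual framing or timing parameters.

```python
import math

# Standard DTMF keypad: each symbol is a (low, high) frequency pair in Hz.
DTMF = {
    "1": (697, 1209), "2": (697, 1336), "3": (697, 1477), "A": (697, 1633),
    "4": (770, 1209), "5": (770, 1336), "6": (770, 1477), "B": (770, 1633),
    "7": (852, 1209), "8": (852, 1336), "9": (852, 1477), "C": (852, 1633),
    "*": (941, 1209), "0": (941, 1336), "#": (941, 1477), "D": (941, 1633),
}
# 16 symbols means 4 bits per symbol; the ordering below is arbitrary.
SYMBOLS = "0123456789ABCD*#"

def bytes_to_symbols(data: bytes) -> str:
    """Split each byte into two 4-bit nibbles and map them to DTMF symbols."""
    out = []
    for b in data:
        out.append(SYMBOLS[b >> 4])     # high nibble
        out.append(SYMBOLS[b & 0x0F])   # low nibble
    return "".join(out)

def symbol_to_samples(sym: str, rate: int = 8000, dur: float = 0.05) -> list:
    """Render one DTMF symbol as the sum of its two sine tones."""
    lo, hi = DTMF[sym]
    n = int(rate * dur)
    return [0.5 * (math.sin(2 * math.pi * lo * t / rate) +
                   math.sin(2 * math.pi * hi * t / rate)) for t in range(n)]

def encode(data: bytes) -> list:
    """Encode arbitrary bytes as a stream of audio samples."""
    samples = []
    for sym in bytes_to_symbols(data):
        samples.extend(symbol_to_samples(sym))
    return samples
```

In a real deployment the receiving end would decode the tones back into nibbles (for example with per-frequency Goertzel filters), but that half of the channel is omitted here.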

The amount of data that can be transmitted through such an attack is, admittedly, limited: the researchers found that "modest amounts of data" could be exfiltrated "with high accuracy" during a phone call lasting a few minutes. The key to the attack is that the audio is inaudible, or nearly so, to anyone nearby, because it is modulated onto a carrier at frequencies near the upper end of the human hearing range.

"The attack we identified can be carried out without being noticed by a human, demonstrating the feasibility of stealthy data exfiltration from compromised computers with smart speakers," first author Zhengxian He said in an interview with TechXplore, which alerted us to the study. "A modest amount of data (e.g., a kilobyte of data) can be transmitted with high accuracy by a call lasting less than 5 minutes in a realistic setting even when the smart speaker is several feet away from the computer where the data is stored."
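As a back-of-the-envelope check on those quoted figures, assuming the full kilobyte takes the full five minutes (the paper's exact framing and error-correction overhead are not reproduced here):

```python
payload_bits = 1024 * 8   # "a kilobyte of data"
call_seconds = 5 * 60     # "a call lasting less than 5 minutes"

bitrate = payload_bits / call_seconds  # effective throughput in bits/s
symbol_rate = bitrate / 4              # 16 DTMF symbols carry 4 bits each

print(round(bitrate, 1), "bps,", round(symbol_rate, 1), "symbols/s")
```

That works out to roughly 27 bits per second, or about seven DTMF symbols per second: slow, but ample for credentials, keys, or other small, high-value secrets.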

The idea of attacking a voice assistant in a way that is inaudible to nearby users isn't new: last year, the Light Commands attack used lasers to trigger the MEMS microphones in common voice assistant systems without any audio at all; SurfingAttack sends inaudible ultrasonic commands through the surface of the table on which a smart speaker rests; and Audio Hotspot crosses two ultrasonic beams so that the spoken commands are audible only at the exact position of the target speaker. Using the voice assistant as an exfiltration channel for an infected system on the network, though, is certainly a new angle.

The team's paper is available on arXiv.org under open-access terms.

Gareth Halfacree
Freelance journalist, technical author, hacker, tinkerer, erstwhile sysadmin. For hire: freelance@halfacree.co.uk.