NC State Researchers Warn of Privacy Problems Surrounding Amazon Alexa Third-Party Skills

Third-party skills are Alexa's lifeblood, but poor privacy practices on Amazon's part could let malicious developers run riot.

Researchers from North Carolina State University and Ruhr-Universität Bochum have published a study highlighting a range of privacy concerns in Amazon's Alexa platform, concentrating on the use of third-party "skills."

"When people use Alexa to play games or seek information, they often think they’re interacting only with Amazon," explains assistant professor Anupam Das of the problems highlighted in the study. "But a lot of the applications they are interacting with were created by third parties, and we’ve identified several flaws in the current vetting process that could allow those third parties to gain access to users' personal or private information."

The study investigated 90,194 unique skills, the name Amazon gives to the programs which run atop the Alexa platform to extend its capabilities. These were gathered using an automated program which analysed the content of seven different skill "stores" — and which highlighted the very first issue: a lack of any verification that the name presented as a skill's developer actually belongs to the person or company who built it.

A second flaw was found in the way multiple different skills can be triggered using the same invocation phrase. "This is problematic because if you think you are activating one skill but are actually activating another, this creates the risk that you will share information with a developer that you did not intend to share information with," explains Das. "For example, some skills require linking to a third-party account, such as an email, banking, or social media account. This could pose a significant privacy or security risk to users."
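The invocation-phrase collision Das describes can be pictured with a short sketch. This is purely illustrative: the skill metadata fields and catalogue below are hypothetical, not Amazon's actual schema, and the function simply flags phrases claimed by more than one developer — the situation in which a user could unknowingly activate the wrong skill.

```python
# Hypothetical sketch of the collision risk: group skills by their
# invocation phrase and flag phrases claimed by multiple developers.
# The metadata format here is invented for illustration.
from collections import defaultdict


def find_invocation_collisions(skills):
    """Return invocation phrases registered by more than one developer,
    where a user asking for one skill might trigger another."""
    by_phrase = defaultdict(list)
    for skill in skills:
        # Invocation phrases are matched case-insensitively here.
        by_phrase[skill["invocation"].strip().lower()].append(skill)
    return {
        phrase: entries
        for phrase, entries in by_phrase.items()
        if len({s["developer"] for s in entries}) > 1
    }


# A toy catalogue: two unrelated developers claim the same phrase.
catalogue = [
    {"name": "Daily Horoscope", "developer": "DevA", "invocation": "daily horoscope"},
    {"name": "My Horoscope",    "developer": "DevB", "invocation": "Daily Horoscope"},
    {"name": "Cat Facts",       "developer": "DevC", "invocation": "cat facts"},
]

collisions = find_invocation_collisions(catalogue)
# "daily horoscope" is claimed by two different developers, so a user
# saying that phrase cannot be sure which skill — and which developer —
# they are about to hand their data to.
```

In this toy example, only "daily horoscope" is flagged, because "cat facts" has a single developer; the study's concern is precisely that Alexa resolves such collisions invisibly to the user.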

Other flaws highlighted in the study include the ability for developers to modify their skills' code after it has been approved by Amazon, potentially adding unwanted features which would have been rejected on initial review, and a lack of published privacy policies: 23.3 percent of the analysed skills requested access to privacy-sensitive data while either lacking a privacy policy entirely or publishing one that was misleading or incomplete, despite Amazon requiring a robust policy as a condition of use.

The researchers make a series of recommendations for how Amazon could improve matters, including: skill-type indicators, which would help users differentiate between skills which share an invocation phrase; better validation of developers, to prevent skill publishers falsely claiming to be part of a trusted organisation; recurring back-end validation, to catch code that has been changed after its initial approval on the platform; and the provision of a privacy policy template.

The paper, which was presented at the Network and Distributed Systems Security Symposium 2021, is available to download now under open-access terms in PDF format.

Gareth Halfacree
Freelance journalist, technical author, hacker, tinkerer, erstwhile sysadmin. For hire: freelance@halfacree.co.uk.