FTC: Alexa Data Abuse
Author:
• Elisa Jillson, U.S. Federal Trade Commission (FTC) – reprinted
Article Abstract
In two recent complaints, the Federal Trade Commission (FTC) alleges that Amazon and Ring mishandled consumers' private data: voice recordings collected by Amazon's Alexa voice assistant and video footage captured by Ring's home security cameras. The complaints, the first matters announced since the FTC issued its Biometric Policy Statement, highlight corporate negligence toward customer privacy in an age of data-hungry AI development. Emphasizing the nexus of AI and privacy, the FTC urges caution in data collection and stresses the need for consumer control.
The complaints describe inadequate consent mechanisms, weak controls over employee access to sensitive data, and lax security, and the FTC warns companies of the consequences of mishandling such data. Protecting biometric data, safeguarding kids' privacy, and requiring lawful data acquisition emerge as the FTC's priority areas, backed by enforcement against data misuse.
From the FTC: Hey, Alexa! What are you doing with my data?
What you say in your home, what you do in your home. It doesn’t get more private than that. But, according to two recent FTC complaints, Amazon and Ring used this highly private data – voice recordings collected by Amazon’s Alexa voice assistant and videos collected by Ring’s internet-connected home security cameras – to train their algorithms while giving short shrift to customers’ privacy. These matters, the first announced since the FTC’s new Biometric Policy Statement, contain important lessons for companies using AI, biometric data, and other sensitive information.
AI and privacy should work hand-in-hand
In this age of AI, developers want more and more data – oftentimes, no matter its source. But be careful when collecting or keeping consumer data. Under Section 5’s unfairness standard, the FTC doesn’t look just at AI’s potential benefits, but also at the costs to consumers. According to the complaints, Amazon and Ring failed that test. The FTC alleged Ring’s data access practices enabled spying and harassment, while Amazon’s permanent retention of voice data and shoddy deletion practices exposed consumers’ voice recordings to the risk of unnecessary employee access. The message for businesses: The FTC will hold companies accountable for how they obtain, retain, and use the consumer data that powers their algorithms. As the Commissioners put it in their joint statement in the Alexa matter, machine learning is not a license to break the law.
Consumers – not companies – control their data
Some companies think they’re free to use personal data in their possession for any purpose they choose. Not so fast. The FTC complaints against Amazon and Ring make clear that companies that ignore consumers’ rights to control their data do so at their peril. In its complaint, the FTC says Ring gave all employees and contractors access to customers’ videos to train algorithms (among other things) with only check-the-box “consent.” But that’s not enough to ensure that users are really in control of what happens to their information. And in the Amazon complaint, the FTC says Amazon undermined parents’ rights under the Children’s Online Privacy Protection Act (COPPA) Rule to delete their children’s voice recordings. Parents have the right under the COPPA Rule to decide what data about their children is stored by a company, and what data is deleted. The upshot is clear: Any company that undermines consumer control of their data can face FTC enforcement action.
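The deletion failure alleged in the Amazon complaint (audio files removed but transcripts retained) suggests a practical engineering takeaway: a deletion request should propagate to every artifact derived from a recording. Below is a minimal sketch of that idea in Python; the VoiceRecordStore class, its field names, and the purge_from_dataset helper are invented for illustration and do not describe any actual Amazon system.

```python
from dataclasses import dataclass, field

@dataclass
class VoiceRecordStore:
    """Hypothetical store that tracks a recording and everything derived from it."""
    audio: dict = field(default_factory=dict)          # record_id -> audio blob
    transcripts: dict = field(default_factory=dict)    # record_id -> transcript text
    training_refs: dict = field(default_factory=dict)  # record_id -> list of dataset paths

    def delete_recording(self, record_id: str) -> None:
        """Honor a deletion request by removing the audio AND all derived data.

        Deleting only the audio while quietly keeping transcripts or training
        references is exactly the kind of partial deletion the complaint faults.
        """
        self.audio.pop(record_id, None)
        self.transcripts.pop(record_id, None)
        for dataset_path in self.training_refs.pop(record_id, []):
            purge_from_dataset(dataset_path, record_id)

def purge_from_dataset(dataset_path: str, record_id: str) -> None:
    # Placeholder: remove the record from any exported training dataset as well.
    ...
```

The design point is simply that deletion is modeled as one operation over all linked data, so no derived copy can be forgotten.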
Place special safeguards on human review and employee access to sensitive data
AI developers often rely on human reviewers to tag and annotate the data that trains machine learning algorithms. But do consumers know when their data is under review? In its complaint, the FTC says Ring hid this review from its customers and let reviewers abuse their access to consumers’ videos. As a result, Ring’s customers – who bought Ring’s products for more security – ended up being the target of Ring employees’ spying and surveillance. The Amazon complaint also says that Amazon didn’t use appropriate controls to limit which employees could access Alexa users’ voice recordings, so thousands of employees had access to sensitive voice recordings that they didn’t need. Companies relying on human review are on notice that safeguards for sensitive data, including strict access controls, can’t be an afterthought. They should be the first step.
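As one illustration of what "strict access controls" for human review can look like in practice, here is a minimal deny-by-default permission check with audit logging, sketched in Python. The role names, the AccessDenied exception, and the load_recording helper are assumptions made up for this example, not details drawn from the complaints.

```python
import logging
from datetime import datetime, timezone

logger = logging.getLogger("sensitive_data_access")

# Deny by default: only roles with a documented review need are listed.
REVIEW_ROLES = {"annotation-reviewer"}

class AccessDenied(Exception):
    pass

def fetch_recording_for_review(user_role: str, user_id: str, record_id: str):
    """Gate every reviewer read behind a role check plus an audit-log entry."""
    if user_role not in REVIEW_ROLES:
        logger.warning("denied %s (%s) access to %s", user_id, user_role, record_id)
        raise AccessDenied(f"role {user_role!r} may not review recordings")
    # Every successful access is logged, so later abuse is detectable.
    logger.info("%s accessed %s at %s", user_id, record_id,
                datetime.now(timezone.utc).isoformat())
    return load_recording(record_id)

def load_recording(record_id: str):
    ...  # placeholder for the actual (hypothetical) storage backend
```

The point is that access is scoped to a named, limited role and every read leaves a trail — the opposite of the broad, unmonitored access the complaints describe.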
The FTC protects biometric data
Last month, the FTC issued a policy statement on the protection of biometric data. That statement explains that biometric data – whether fingerprints and iris scans or videos and voice recordings – deserves the utmost protection because of its inherent sensitivity and the potential for bias, discrimination, and other harmful uses. The FTC’s settlements with Amazon and Ring underscore that when the FTC says protecting biometric data is a priority, it means what it says – and the Commission will back up that policy with enforcement action.
The FTC uses every tool available to protect kids’ privacy
After a series of enforcement actions about kids’ and teen privacy (think Microsoft, Epic Games, Edmodo, Weight Watchers (Kurbo), and Chegg), it should be clear that protecting kids is a top FTC priority. That’s especially true at the intersection of AI and kid and teen privacy. In the Amazon complaint, the FTC says Amazon was keeping kids’ voice recordings (both audio files and transcripts) permanently and undermining parents’ deletion rights. According to the complaint, Amazon could then use that data for natural language processing. In the Ring complaint, the FTC describes Ring’s cavalier approach to privacy and security, notwithstanding the fact that its cameras were marketed to watch over kids’ bedrooms. The FTC’s response? No dice. The FTC will use every available tool – including the COPPA Rule and the FTC Act’s prohibitions on deceptive and unfair practices – to protect kids’ privacy.
Want to keep your algorithms and data products? Get the data lawfully
With Ring and Alexa, as well as Kurbo, Cambridge Analytica, and Everalbum, the FTC has obtained numerous orders requiring companies to delete data and delete or refrain from generating data products, like algorithms, models, and other tools derived from ill-gotten data. These actions make clear that there are no free passes for data abuse. If you illegally obtain or misuse consumer data, you may well pay with your data product.
-30-