UPDATE: Apple now has an opt-in for human review with iOS 13.2. It also gives users the option to delete their Siri voice recordings. The images below have been updated accordingly.
In September 2019, Google reactivated its human review program for the Google Home, now with user consent.
Read on to find out how to change your settings.
If you have a voice assistant in your home or on your phone, have you ever been concerned that someone from the company could listen to your voice recordings?
Recent news coverage confirms that suspicion.
At the end of July, The Guardian reported that people at Apple were regularly listening to recordings of deeply personal events such as conversations with doctors, sexual encounters, and other moments. While the effort was designed as a quality control measure, users likely had no idea that some of their utterances were being recorded and reviewed by humans.
Since then, Apple has temporarily suspended its human review program, Google has been forced to pause its own review program in the EU, and Amazon is now giving users the ability to opt out.
Mozilla has put together a guide for you to change your privacy settings on voice assistants.
Even with these additional privacy controls, these programs still raise a number of concerns that haven’t yet been resolved, including:
- For users who don’t opt out, workers at Amazon and Google are still listening to a small segment of recordings from people’s smart voice assistants, and despite efforts to anonymize that data, recordings can contain sensitive and personally identifiable information. Apple does a good job of clearly explaining how user data is stored and analyzed if users do opt in, but it too has a human review program that, even with its strong precautions for anonymization, could give workers snippets of voice recordings containing personally identifiable information.
- In many cases, recordings were made even without someone saying the wake word (“Hey Google”) or because they said something that merely sounded similar to the wake word (such as “Syria,” which can trigger Apple’s Siri). People may not have known they were being recorded once the device was triggered to listen.
- Until recent reporting on this issue, these review programs were not clearly disclosed to users, and some, like Amazon’s, did not give users the ability to opt in or out. What’s more, news continues to break that other companies, like Facebook, are also employing human review of users’ voice content without prior disclosure. This raises questions about what meaningful consent should look like when people’s data is used to train a model to improve the product.
We will keep monitoring developments on this issue and, of course, advocate for disclosure and stronger privacy protections in publications like our Privacy Not Included guide and elsewhere. But in the meantime, it is important that consumers like you know how to set the privacy settings on your own voice assistant.
After you change your own settings, will you use the share buttons below to share the graphics with your friends and family?