How to opt out of human review of your voice assistant recordings

By Becca Ricks and Kaili Lambe | August 14, 2019

If you have a voice assistant in your home or on your phone, have you ever been concerned that someone from the company could listen to your voice recordings?

Recent news coverage confirms that suspicion.

At the end of July, The Guardian reported that people at Apple were regularly listening to recordings of deeply personal events such as conversations with doctors, sexual encounters, and other moments. While the effort was designed as a quality control measure, users likely had no idea that some of their utterances were being recorded and reviewed by humans.

Since then, Apple has temporarily suspended its human review program, Google has been forced to pause its own review program in the EU, and Amazon now gives users the ability to opt out.

Mozilla has put together a guide to help you change the privacy settings on your voice assistant:

- Which voice assistants use human review of recordings
- How to opt out on Amazon Alexa
- How to opt out on Google Home

Even with these additional privacy controls, a number of concerns raised by these programs remain unresolved, including:

  1. For users who don’t opt out, workers at Amazon and Google still listen to a small segment of recordings from people’s smart voice assistants. Despite efforts to anonymize that data, recordings can contain sensitive and personally identifiable information.
  2. In many cases, recordings were made without someone saying the wake word (“Hey Google”), or because they said something that sounded similar to the wake word (such as “Syria,” which can trigger Apple’s Siri). People may not have known they were being recorded once the device was triggered to listen.
  3. Until recent reporting on this issue, these review programs were not clearly disclosed to users, and some, like Amazon’s, did not give users the ability to opt in or out. What’s more, news continues to break that other companies, like Facebook, have also employed human review of users’ voice content without prior disclosure. This raises questions about what meaningful consent should look like when people’s data is used to train a model to improve the product.

We will keep monitoring developments on this issue and, of course, advocate for disclosure and stronger privacy protections in publications like our Privacy Not Included guide. But in the meantime, it is important that consumers like you know how to adjust the privacy settings on your own voice assistant.

After you change your own settings, will you use the share buttons below to share the graphics with your friends and family?