Review Date 10/23/2019
Seven horn-loaded tweeters, a six-microphone array, one high-excursion woofer, and a partridge in a pear tree! This fancy smart speaker from Apple comes with six built-in smart microphones. That's a lot of listening. Apple promises anything you say after "Hey Siri" is encrypted and sent anonymously to Apple's servers, though, so you should be safe. Apple also claims this speaker has a big brain with spatial awareness and can learn all about the room you put it in to deliver the best sound. When you listen to those 11 drummers drumming, they're gonna sound awesome.
Can it snoop on me?
Camera — Device: No | App: N/A
Microphone — Device: Yes | App: N/A
Tracks location — Device: No | App: N/A
How does it handle privacy?
How does it share data?
All your Apple HomePod voice requests are anonymous, and Apple auto-deletes them after six months. Apple does not share data with third parties for commercial or marketing purposes.
Can you delete your data?
Collects biometrics data?
May use voice recognition to identify your voice.
User friendly privacy info?
Apple's privacy information is written in fairly simple, easy-to-read language. They recently refreshed their privacy pages, so the information is centralized and presented in a visual format.
What could happen if something goes wrong?
Apple does a good job with privacy and security, promising that it encrypts your voice requests, sends them anonymously, and automatically deletes them after six months. Still, Apple was called out earlier this year when it came out that contractors with the company were listening in on people's intimate conversations as part of "quality control" for the Siri voice assistant. Apple did change how it handles this quality control following the reports. However, it's always good to remember that buying this product means bringing six quality, always-listening microphones into your home.
How to contact the company
Apple apologises for allowing workers to listen to Siri recordings
Apple has apologised for allowing contractors to listen to voice recordings of Siri users in order to grade them. The company made the announcement after it completed a review of the grading programme, which had been triggered by a Guardian report revealing its existence. According to multiple former graders, accidental activations were regularly sent for review, having recorded confidential information, illegal acts, and even Siri users having sex.
Apple contractors 'regularly hear confidential details' on Siri recordings
Apple contractors regularly hear confidential medical information, drug deals, and recordings of couples having sex, as part of their job providing quality control, or “grading”, the company’s Siri voice assistant, the Guardian has learned.
How to opt out of human review of your voice assistant recordings
If you have a voice assistant in your home or on your phone, have you ever been concerned that someone from the company could listen to your voice recordings? Mozilla has put together a guide for you to change your privacy settings on voice assistants.