Apple Watch

Apple
Bluetooth

Review date: Nov. 8, 2021

People voted: Somewhat creepy

The Apple Watch still reigns supreme in the world of smartwatches. You've got all your email, text, phone calls, music, podcasts, and more right there on your wrist (as long as you have an iPhone, of course). And it tracks lots of health data: heart rate, sleep, steps, calories, blood oxygen levels, ECG, fall detection, and more. Apple has a pretty good track record of taking all this very personal data and keeping it safe, which we appreciate.

What could happen if something goes wrong?

Apple does a pretty good job with privacy and security as a company. They say they don't share or sell your data, and Apple takes special care to make sure your Siri requests aren't associated with you, which is great. Apple did face backlash in 2019 when it came to light that contractors reviewing the voice assistant's recordings were regularly listening in on confidential personal conversations. Apple changed their policy so users were no longer automatically opted in to human voice review. Recently, Apple made another positive change for your Siri voice requests -- many audio requests, like setting timers or alarms or controlling music, will no longer be sent over the internet to their servers and will instead be processed directly on the device. This is better for your privacy.

This device does track a whole bunch of biometric data, including your heart rate, blood oxygen levels, menstrual cycle, hearing, breathing, and your heart's electrical signals. That's a lot of personal information gathered in one place. As a reminder, it's always good to lock down the privacy on all this data as much as possible.

What is not good is what can happen with all this very personal health data if others aren't careful. A recent report showed that health data for over 61 million fitness tracker users, including both Fitbit and Apple users, was exposed by a third-party company that let users sync health data from their fitness trackers but did not secure that data properly. Because the company didn't password protect or encrypt its database, personal information such as names, birthdates, weight, height, gender, and geographical location for Apple and other fitness tracker users was left exposed. This is a great reminder that yes, while Apple might do a good job with their own security, anytime you sync or share that data with anyone else, it could be vulnerable. I don't know about you, but I don't need the world to know my weight and where I live. That's really dang creepy.

Tips to protect yourself

  • Restrict how much personal information, like heart rate data, is shared by going to the Apple Watch app on your iPhone under Privacy > Health
  • Be very careful about which third-party companies you consent to share your health data with. If you do decide to share your health data with another company, read their privacy policy to see how they protect, secure, and share or sell your data.
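If you're curious how this per-type consent works under the hood: apps on Apple devices go through the HealthKit API, which makes them request access to each data type separately, and you can deny any of them from the Health privacy settings. A minimal sketch (assuming a watchOS/iOS app target; the specific data types chosen here are just illustrative):

```swift
import HealthKit

// Minimal sketch: an app asking to *read* heart rate and sleep data.
// Each type must be requested explicitly; the user can grant or deny
// each one individually in the Health privacy sheet.
let healthStore = HKHealthStore()

let readTypes: Set<HKObjectType> = [
    HKObjectType.quantityType(forIdentifier: .heartRate)!,
    HKObjectType.categoryType(forIdentifier: .sleepAnalysis)!
]

healthStore.requestAuthorization(toShare: nil, read: readTypes) { granted, error in
    // `granted` only means the request completed without error --
    // HealthKit deliberately hides *which* read permissions the user
    // allowed, so types you denied simply return no data to the app.
    if let error = error {
        print("Authorization failed: \(error.localizedDescription)")
    }
}
```

Note the privacy-friendly design choice: an app can't even tell whether you denied it read access, which keeps your "no" from itself becoming a data point.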
Categories: Mobile, Privacy, Security, A.I.

Can it snoop on me?

Camera

Device: No

App: Yes

Microphone

Device: Yes

App: Yes

Tracks location

Device: Yes

App: Yes

What can be used to sign up?

What data does the company collect?

How does the company use this data?

Apple says it does not share your data with third parties for commercial or marketing purposes. In June 2021, Apple announced that many Siri requests will no longer be sent to its servers and will instead be processed on the device itself.

How can you control your data?

Apple retains personal data only for as long as necessary to fulfill the purposes for which it was collected, including as described in their Privacy Policy or in their service-specific privacy notices, or as required by law. No specific data retention details are provided.

What is the company’s known track record of protecting users’ data?

Average

Unfortunately, Apple's security measures did not prevent the major leak of over 61 million fitness tracker records, including Apple HealthKit data, by the third-party company GetHealth. In September 2021, a group of security researchers discovered that GetHealth had an unsecured database containing over 61 million records related to wearable technology and fitness services. GetHealth accessed health data belonging to wearable device users around the world and leaked it in a non-password-protected, unencrypted database. The records contained names, birthdates, weight, height, gender, and geographical location, as well as other medical data, such as blood pressure.

Can this product be used offline?

Yes

User-friendly privacy information?

Yes

Links to privacy information

Does this product meet our Minimum Security Standards?

Yes

Encryption

Yes

Uses encryption in transit and at rest. After Apple recognizes the words “Hey Siri,” what you say is encrypted and sent anonymously to Apple servers without being tied to your Apple ID. Audio samples are only retained if you have opted in.

Strong password

Yes

Security updates

Yes

Manages vulnerabilities

Yes

Apple has a bug bounty program, which means that anyone who finds a security issue and discloses it responsibly may get paid.

Privacy policy

Yes

Does the product use AI?

Yes

Some of Apple's AI research can be found at https://machinelearning.apple.com/.

Is this AI untrustworthy?

Can’t Determine

What kind of decisions does the AI make about you or for you?

Apple states in its privacy policy, "Apple does not take any decisions involving the use of algorithms or profiling that significantly affect you." Apple employs machine learning in many different ways, from improving Siri to sharpening the photos that you take.

Is the company transparent about how the AI works?

Yes

Does the user have control over the AI features?

Can’t Determine


News

Improving Siri’s privacy protections
Apple
At Apple, we believe privacy is a fundamental human right. We design our products to protect users’ personal data, and we are constantly working to strengthen those protections. This is true for our services as well. Our goal with Siri, the pioneering intelligent assistant, is to provide the best experience for our customers while vigilantly protecting their privacy.
Apple resumes human reviews of Siri audio
Associated Press
Apple Inc. is resuming the use of humans to review Siri commands and dictation with the latest iPhone software update. In August, Apple suspended the practice and apologized for the way it used people, rather than just machines, to review the audio.
Apple’s AI plan: a thousand small conveniences
The Verge
AI has become an integral part of every tech company’s pitch to consumers. Fail to hype up machine learning or neural networks when unveiling a new product, and you might as well be hawking hand-cranked calculators. This can lead to overpromising. But judging by its recent WWDC performance, Apple has adopted a smarter and quieter approach.
Apple apologises for allowing workers to listen to Siri recordings
The Guardian
Apple has apologised for allowing contractors to listen to voice recordings of Siri users in order to grade them. The company made the announcement after it completed a review of the grading programme, which had been triggered by a Guardian report revealing its existence. According to multiple former graders, accidental activations were regularly sent for review, having recorded confidential information, illegal acts, and even Siri users having sex.
