Review date: Nov. 8, 2021
The Apple Watch still reigns supreme in the world of smartwatches. You've got all your email, texts, phone calls, music, podcasts, and more right there on your wrist (as long as you have an iPhone, of course). And it tracks lots of health data: heart rate, sleep, steps, calories, blood oxygen levels, ECG, fall detection, and more. Apple has a pretty good track record of taking all this very personal data and keeping it safe, which we appreciate.
What could happen if something goes wrong?
Apple does a pretty good job with privacy and security as a company. They say they don't share or sell your data, and Apple takes special care to make sure your Siri requests aren't associated with you, which is great. Apple did face backlash in 2019 when it came to light that their contractors were regularly listening in on confidential personal conversations while reviewing the voice assistant's recordings. Apple changed their policy so users are no longer automatically opted in to human voice review. Recently, Apple made another positive change for Siri voice requests: many audio requests, like setting timers or alarms or controlling music, are no longer sent over the internet to Apple's servers and are instead processed directly on the device. This is better for your privacy.
This device does track a whole bunch of biometric data, including your heart rate, blood oxygen levels, menstrual cycle, hearing, breathing, and your heart's electrical signals. That's a lot of personal information gathered in one place. As a reminder, it's always good to lock down the privacy settings on all this data as much as possible.
What is not good is what can happen with all this very personal health data if others aren't careful. A recent report showed that health data for over 61 million fitness tracker users, including both Fitbit and Apple users, was exposed when a third-party company that let users sync health data from their fitness trackers failed to secure that data properly. Personal information such as names, birthdates, weight, height, gender, and geographical location for Apple and other fitness tracker users was left exposed because the company didn't password-protect or encrypt its database. This is a great reminder that while Apple might do a good job with its own security, anytime you sync or share that data with anyone else, it could be vulnerable. I don't know about you, but I don't need the world to know my weight and where I live. That's really dang creepy.
Tips to protect yourself
- Restrict how much personal information, like heart rate data, is shared by going to the Apple Watch app on your iPhone under: Privacy > Health
What can be used to sign up?
What data does the company collect?
Name, contact information, address
Heart rate, movement, blood oxygen levels, sleep data, voice recordings if you use voice commands
How does the company use this data?
How can you control your data?
What is the company’s known track record of protecting users’ data?
Unfortunately, Apple's security measures did not prevent a major leak of 61 million fitness tracker records, including Apple HealthKit data, by the third-party company GetHealth. In September 2021, a group of security researchers discovered that GetHealth had an unsecured database containing over 61 million records related to wearable technology and fitness services. GetHealth accessed health data belonging to wearable device users around the world and leaked it in a non-password-protected, unencrypted database. The exposed records contained names, birthdates, weight, height, gender, and geographical location, as well as other medical data, such as blood pressure.
Can this product be used offline?
User-friendly privacy information?
Links to privacy information
Does this product meet our Minimum Security Standards?
Uses encryption in transit and at rest. After Apple recognizes the words “Hey Siri,” what you say is encrypted and sent anonymously to Apple servers without being tied to your Apple ID. Audio samples are only retained if you have opted-in.
Apple has a bug bounty program, which means that anyone who finds a security issue and discloses it responsibly may get paid.
Some of Apple's AI research can be found at https://machinelearning.apple.com/.
Is this AI untrustworthy?
What kind of decisions does the AI make about you or for you?
Is the company transparent about how the AI works?
Does the user have control over the AI features?
Got a comment? Let us hear it.