Apple Watch 6

Apple $399 - $799
Bluetooth

Review date: 11/02/2020

The Apple Watch still reigns supreme in the world of smartwatches. You've got all your email, texts, phone calls, music, podcasts, and more right there on your wrist (as long as you have an iPhone, of course). And now you have even more health data. There's heart rate, sleep tracking, steps, calories, and this year Apple has added what's called pulse ox, which is basically a measure of how much oxygen is in your blood. In the era of coronavirus, knowing how much oxygen is in your blood is actually a good thing. Apple has a pretty good track record of taking all this very personal data and keeping it safe, which we appreciate.

What could happen if something goes wrong

Apple does a pretty good job with privacy and security as a company. They don't share or sell your data, and Apple takes special care to make sure your Siri requests aren't associated with you. Apple did face backlash in 2019 when it came to light that their contractors were regularly listening in on confidential personal conversations while reviewing the voice assistant's recordings. Apple changed their policy so users weren't automatically opted in to human voice review. This device does track a whole bunch of biometric data, including your heart rate, blood oxygen levels, menstrual cycle, hearing, breathing, and your heart's electrical signals. Better hope your insurance company never gets access to all that info, because that could get weird (and costly).

Privacy

Can it snoop on me?

Camera

Device: No

App: Yes

Microphone

Device: Yes

App: Yes

Tracks Location

Device: Yes

App: Yes

What is required to sign up?

What data does it collect?

How can you control your data?

You can request that your data be deleted. Go to https://privacy.apple.com/ to get a copy of your data, correct your data, or delete your account.

What is the company’s known track record for protecting users’ data?

Great

They actually changed their Siri voice recording review practices from opt-out to opt-in when people told them they were unhappy having contractors listen to the recordings. Good for them!

Can this product be used offline?

Yes

A Wi-Fi connection is required to set it up.

User friendly privacy information?

Yes

Links to privacy information

Security

Does this product meet our Minimum Security Standards?

Yes

Encryption

Yes

Uses encryption in transit and at rest. After Apple recognizes the words “Hey Siri,” what you say is encrypted and sent anonymously to Apple servers without being tied to your Apple ID. Audio samples are only retained if you have opted in.

Strong password

Yes

Security updates

Yes

Manages vulnerabilities

Yes

Apple has a bug bounty program, which means that anyone who finds a security issue and discloses it responsibly may get paid. https://developer.apple.com/security-bounty/

Privacy policy

Yes

Apple has a webpage highlighting its privacy principles and features, and its privacy policy opens with a statement of those principles. While that statement is very long, it is clearly broken out into relevant topics.

Artificial Intelligence

Does the product use AI?

Yes

Does the AI use your personal data to make decisions about you?

No

Does the company allow users to see how the AI works?

Yes

Apple employs machine learning in many different ways, from improving Siri to sharpening the photos that you take. Apple states in its privacy policy, "Apple does not take any decisions involving the use of algorithms or profiling that significantly affect you." Some of its research can be found at https://machinelearning.apple.com/.

Company contact info

Phone Number

(800) 275-2273

Twitter

@AppleSupport

Updates

Apple apologises for allowing workers to listen to Siri recordings
The Guardian
Apple has apologised for allowing contractors to listen to voice recordings of Siri users in order to grade them. The company made the announcement after it completed a review of the grading programme, which had been triggered by a Guardian report revealing its existence. According to multiple former graders, accidental activations were regularly sent for review, having recorded confidential information, illegal acts, and even Siri users having sex.

Apple’s AI plan: a thousand small conveniences
James Vincent
AI has become an integral part of every tech company’s pitch to consumers. Fail to hype up machine learning or neural networks when unveiling a new product, and you might as well be hawking hand-cranked calculators. This can lead to overpromising. But judging by its recent WWDC performance, Apple has adopted a smarter and quieter approach.

Improving Siri’s privacy protections
Apple
At Apple, we believe privacy is a fundamental human right. We design our products to protect users’ personal data, and we are constantly working to strengthen those protections. This is true for our services as well. Our goal with Siri, the pioneering intelligent assistant, is to provide the best experience for our customers while vigilantly protecting their privacy.

Apple resumes human reviews of Siri audio
Associated Press
Apple Inc. is resuming the use of humans to review Siri commands and dictation with the latest iPhone software update. In August, Apple suspended the practice and apologized for the way it used people, rather than just machines, to review the audio.
