Companies ignore, assume, or coerce your “consent” while they bury the rules of the game in a privacy policy they know no one will read. It’s annoying. It’s creepy. And it’s why we’re here.
“You might expect apps that handle very sensitive information — about your period and your anxiety — to treat your consent more seriously… After combing through 65 mental and reproductive health apps’ privacy policies, we learned they mostly don’t.”
Misha Rykov, Researcher @ *Privacy Not Included
For apps that handle health information to be so cavalier with consent crosses the line from creepy to harmful. Here are some of the consent trickery techniques we found, how you can spot them for yourself, and what you can do to better understand what you might be “consenting” to.
What it looks like: hopping right into personal surveys before you’ve had a chance to see how this information will be protected (or not).
When you land on a page that starts with a survey or a chatbot, and you haven’t agreed to or been offered a privacy policy yet, you might assume that information is (just) being used to match you with the right product or service. It’s not totally kosher for those answers to be collected without you knowing how they’re going to be stored and used. But it’s especially bad when that information is shared for a reason you probably wouldn’t agree to if you had the choice.
In Talkspace’s case, the privacy policy says, “Inferences about your interests are derived from your responses to surveys [like the one above] completed prior to becoming a subscriber,” and those inferences are cited as a source of information that can be used for targeted ads.
They’re not the only mental health app that puts a super personal survey ahead of their privacy policy – BetterHelp, Happify, and Youper do it too.
What it looks like: privacy policies that say:
“We do not generally disclose your personal information to any third party without your specific consent, except as permitted or required by law.”
“We may receive Personal Information about you from other sources with your consent or as permitted by applicable law.”
“Except” and “or” are two words you don’t want to see following a promise. The thing is, as we’ll talk about later, data privacy law doesn’t usually cover consent in much detail and is generally pretty permissive. So when apps say “we won’t do it unless it’s legal to do it,” they’re not saying much at all.
Honestly, it’s hard even for us to tell what they mean. But where there are doubts there are worries, because these little loopholes leave the door open for your personal data to be, in these two examples, shared, or collected from “other sources.”
And that’s something that does scare us, because when your data is combined, shared, bought or sold it’s often not going to be for reasons that benefit you. Sometimes, it can be used to serve you way-too-tailored ads. But the worst case scenario is, well, a lot worse than that. Depending on where you live, the consequence of your sexual orientation, pregnancy, or other private health information being shared could be a threat to your freedom, even your life.
What it looks like: privacy policies that say:
“By logging in, or using the Application, You agree to be bound by these Terms, the Privacy Policy including any additional guidelines and future modifications.”
This consent model basically says that if you’re using our app, it’s because you agree to our rules (which you may or may not be aware of, but probably aren’t). It’s hard to spot, because even when apps don’t say it outright, like Maya Period, Fertility, Ovulation, & Pregnancy does, many do “assume” users have read and agreed to their privacy policies and take that to mean you “consent” to what’s inside.
In their privacy policy, WebMD Pregnancy “urge[s] you to read [it] carefully” before you install their apps or use their services, which includes even visiting their websites. But you’d already have to be visiting their website to read that in the privacy policy. Call us sticklers but we believe it should be possible for consumers to know what they’re consenting to before companies are allowed to assume they’ve consented.
“Implicit” consent, as it's usually called, is okay sometimes. It makes more sense when you’re well informed or you would probably agree anyway. If you’re walking into a store and there’s a sign above your head that says “smile, you’re on camera,” you’ve implicitly consented to being filmed by entering. Being surveilled stinks but at least you’re aware of it before you walk in.
For reproductive health apps to take the same approach to consent isn’t fair game. There’s no “sign above the door” that warns you before you download the app, it’s too often buried in the privacy policy. And even if you feel like how you act in a public store isn’t private, you probably wouldn’t put the information these apps collect — like your birthday, health information, weight, or sex life details — in that category.
… And yet, both of these apps assume they have your permission to not just store but go ahead and share some of your personal information – with “business partners” or for advertising. Yeesh. That feels like a leap.
In this case, you do technically have the choice to opt out, but it’s not easy. And let’s face it, even when you’re not struggling with your mental health, reading complicated fine print and sorting through hard-to-find settings is a lot to ask. So it doesn’t feel fair for companies to use that to their advantage. Apps do though, by using design to nudge you toward the choice they’d prefer you make, “consenting” to what you otherwise wouldn’t.
Isn’t this somehow against the law?
Mostly, no. But sometimes yes. It depends where you live. The EU’s General Data Protection Regulation (GDPR) has a very strong stance on consent for data processing and what, specifically, that means. Consent has to be an “affirmative action,” that’s “freely given” and “informed” – among other requirements. It’s a version of consent that just makes sense.
But even Europeans aren’t totally out of the woods. “Consent” is only one of six lawful bases for data processing under the GDPR. So companies can try to shoehorn their use of your data into a different “legitimate reason” besides consent. Meta tried to do this by arguing that users want their data to be used for personalized ads, saying it’s part of what people sign up for when they join Facebook, Instagram, and WhatsApp. Hmm.
Most other countries, including the United States, either don’t have a federal personal data privacy law, or have one that’s ambiguous on consent. In the US, that leaves the limited Health Insurance Portability and Accountability Act (HIPAA) to do the heavy lifting on protecting health information, while other kinds of data privacy protections vary by state. But even the strongest of those, like the California Consumer Privacy Act (CCPA), leave room for murkier kinds of consent. So does Canada’s equivalent, the Personal Information Protection and Electronic Documents Act (PIPEDA).
And when consent is protected by the law? It’s still easy for apps to fall between the cracks.
“You expect that it's like the safety of ketchup in the supermarket — that when an app is on the market it meets some minimum safe data practices. But it doesn't work that way. Data protection authorities are bureaucratic and slow-moving while the scale of digital products is massive. They can’t keep up.”
Misha Rykov, Researcher @ *Privacy Not Included
This means that the “protection” the law offers often comes too late, as a penalty or a fine after your private information has already been exposed. Like when, just earlier this year:
- Mental health app Cerebral admitted to a HIPAA violation after sharing the private personal health information of over 3.1 million patients with social media sites like Facebook and TikTok.
- BetterHelp got in trouble with the US Federal Trade Commission (FTC) for sharing health data it had promised to keep private — including information about mental health challenges — with companies including Facebook and Snapchat. BetterHelp said the settlement (which included a payment of $7.8M) was not an admission of wrongdoing and that the behavior for which it was sanctioned is standard for the industry.
What can you do about it?
No one should have to pay for wellness with their privacy. And it shouldn’t be on you to defend your data from companies that seem to have invented a totally new meaning for the word “consent.” But, until these pretty common practices are abandoned, there are steps you can take to better protect yourself and your community.
To get more detailed privacy tips, check out our individual mental health and reproductive app reviews.
Want to make an even bigger impact? Join us as a monthly donor today to help provide sustainable funding so we can fight for better tech all year round!
Misha Rykov
Misha Rykov, originally from Kyiv and currently based in Berlin, worked in Big Tech and security consulting before joining Mozilla’s privacy initiative. Misha is passionate about investigative storytelling and despises murky privacy policies.
Zoë MacDonald
Zoë MacDonald is a writer and digital strategist based in Toronto, Canada. Before her passion for digital rights led her to Mozilla and *Privacy Not Included, she wrote about cybersecurity and e-commerce. When she’s not nerding out about privacy at work, she eyes the smart devices in her home with suspicion.
Jen Caltrider
While working, rather haphazardly, on my Master’s in Artificial Intelligence, I quickly realized that I’m far better at telling stories than writing code. That discovery led to a career as a journalist covering the tech industry for CNN. I’ve always wanted to leave the world a little better than the one I grew up in. That’s why I created and led Mozilla’s *Privacy Not Included initiative — to fight for better privacy for everyone.