Last year, we reviewed the privacy and security of 27 mental health apps and we were shocked at how bad they were. Which is saying a lot for a team that reads privacy policies for a living. Twenty-three of the apps we reviewed earned our *Privacy Not Included warning label at launch for failing to respect and protect their users’ privacy.
Now, it’s Mental Health Awareness month (in some parts of the world) again, and mental health apps are more popular than ever. The industry has grown by about a billion dollars since 2022, which explains why you might feel like you can’t scroll Reddit or listen to a podcast without seeing or hearing an ad for one.
So in 2023, we’re checking in again to see what, if anything, has changed. In addition to re-reviewing last year’s 27 mental health apps, we listened to our readers’ requests and reviewed five new apps: Cerebral, Insight Timer, Finch, Ginger, and Replika: My AI friend.
What’s the verdict? Some are better! Many are worse. And more than a few are downright creepy.
The apps that improved!
There is good news in the world of mental health apps. We did see some improvement.
- Moodfit improved their security by requiring strong passwords. Last year, we were able to log in to too many apps with weak passwords like “1” or “111111.” We still had that problem with a few apps, but others improved, mostly because we asked, and that’s no small thing.
- Calm and Modern Health made improvements by clarifying in their privacy policies that all users, regardless of what privacy laws they live under, have the same rights to access and delete their data. This is great for people who don’t live under stronger privacy laws like California’s Consumer Privacy Act (CCPA) or the EU’s General Data Protection Regulation (GDPR). It’s important, especially with apps that collect as much sensitive information as mental health apps, that everyone be able to get that data deleted.
- Youper is our most improved app overall. Like Woebot, Youper now clearly states that all users have the right to delete their data, and they updated their password requirements. They’ve also improved what data they collect, how much of it, and what they do with it. Good work, Youper!
- And the best news of all: last year, two apps -- the AI chatbot Wysa and PTSD Coach -- made our “Best Of” list. Those two apps are still head and shoulders above the other apps in terms of privacy and security and remain “Best Of.”

All in all, we saw about eight of the 27 apps we reviewed last year improve their privacy in some way, big or small.
The apps that got worse.
Unfortunately, we saw even more apps get worse at privacy and security. Around 17 of the 27 apps we reviewed last year had privacy and security practices that were either worse or still pretty bad in 2023. Apps like Headspace, BetterHelp and Pride Counseling (owned by BetterHelp), and Talkspace fall into this unhappy category.
We put together a little highlight reel of awful for you, with a few of the most worrisome things we learned, after hundreds of hours of research:
- BetterHelp has a $7.8 million judgment issued against it by the FTC, the US regulatory agency, for promising not to share sensitive mental health information for advertising and then turning around and doing just that. It’s great to see policymakers and regulators crack down on the super creepy practice of sharing sensitive health information to make money. Gross!
- Replika: My AI Friend is perhaps the worst app we’ve ever reviewed. It received all four of our privacy dings and doesn’t meet our Minimum Security Standards.
- Some apps seem to no longer be supported, which means no security updates and who knows what’s happening to any data the apps might collect. Those apps include Better Stop Suicide, Liberate, and RAINN.
- Meditation app Shine, created by two women of color to focus on serving groups underrepresented in wellness, was bought by Headspace Health. That means all the data Shine owned probably got transferred over to Headspace Health. And Headspace is one of the apps whose privacy we rated as worse this year.
At a glance: How the mental health apps stack up.
Here's how the apps did according to our privacy and security criteria.
What can you do?
So, the prognosis is not exactly great for mental health apps once again in 2023. And we realize that reading about privacy can sometimes make you feel like you need to go full tin-can-phone analog to stay safe. But don’t give up on mental health apps! There’s lots you can do to protect yourself.
Choose apps that you can trust (when possible)
Read the reviews, we write ‘em just for you! They’ll help you sort through the good, bad, and just okay apps in terms of privacy.
In each review, there are custom tips on how to use those apps in a way that preserves more of your privacy. And that goes double for the ones with warning labels. Even though we don’t recommend using them, we understand they may be the only app your employer offers, or they may be helping you too much to give up completely.
If you do have the choice, there are two apps we can actually recommend: Wysa and PTSD Coach. Wysa offers multiple levels of care through the app, with an “AI coach” as the suggested first step. Unlike pretty much every other mental health app, the conversations between you and your virtual coach are totally private. And, you can delete them whenever you want. That means even the privacy-conscious can let their guard down. (Hello, Wysa? It’s me…)
PTSD Coach is another goodie. It was developed by the US Department of Veterans Affairs National Center for PTSD, but that doesn’t mean it’s just for US veterans. The app can be (and is) helpful to people suffering from post-traumatic stress disorder all around the world. Because it doesn’t collect any personal information, you don’t have to stress about that being shared. According to their website, it’s basically a resource hub, with “fact and self-help skills based on research.” Nice.
Not only do both of these apps avoid all of our “dings,” but they really seem to value protecting their users’ privacy. It’s like they’re not in the data business at all! Really, they’re great.
Take steps to protect your own privacy
Since most of the mental health apps we looked at are, let’s say, imperfect with their privacy practices, here are some ways you can protect your privacy in general.
Limit the information you share
Keep your apps on a strictly need-to-know basis. Where you can, just say less, especially if it won’t improve your experience. Know that things like your answers to super personal surveys and, in some cases, your chat transcripts, might not be private.
You can also limit the flow of data that’s often collected automatically, by:
- Not connecting your socials to your apps or using them to sign in.
- Using your phone’s settings to limit access: don’t give the app permission to use your camera, microphone, images, or location unless it’s necessary. And in your privacy settings, turn on “Limit ad tracking.”
- Turning off ad personalization on Google.
Practice good cyber hygiene
These two are easy wins that can help you preserve the privacy of all your apps:
- Keep them updated (so that you can benefit as soon as possible when security vulnerabilities are patched).
- Always choose a strong and unique password. A password manager can help.
Ask to have your information deleted
Once you stop using the app, request to have your data deleted. You might think that deleting the app erases your data too, but, in our experience, it almost never works that way.
Help us send a message
No one should have to pay for mental healthcare with their privacy. Our goal is to encourage these apps – all of them – to use a higher standard of privacy than the super low bar set by law and the status quo.
And, it’s working. At least eight of the apps we reviewed last year improved their security or privacy policies after we asked them to. Together, we can do more!
During a rather unplanned stint working on my Master’s degree in Artificial Intelligence, I quickly discovered I’m much better at telling stories than writing code. This discovery led to an interesting career as a journalist covering technology at CNN. My true passion in life has always been to leave the world a little better than I found it. Which is why I created and lead Mozilla's *Privacy Not Included work to fight for better privacy for us all.
Kyiv-native and Berlin-based, Misha worked in big tech and security consulting before joining Mozilla’s privacy effort. Misha loves investigative storytelling and hates messy privacy policies. Misha is an advocate for stronger and smarter privacy regulations, as well as for a safer Internet.
Zoë is a writer and digital strategist based in Toronto, Canada. Before her passion for digital rights led her to Mozilla and *Privacy Not Included, she wrote about cybersecurity and e-commerce. When she’s not being a privacy nerd at work, she’s side-eyeing smart devices at home.