Last year, we reviewed the privacy and security of 27 mental health apps and we were shocked at how bad they were. Which is saying a lot for a team that reads privacy policies for a living. Twenty-three of the apps we reviewed earned our *Privacy Not Included warning label at launch for failing to respect and protect their users’ privacy.

Now, it’s Mental Health Awareness month (in some parts of the world) again, and mental health apps are more popular than ever. The industry has grown by about a billion dollars since 2022, which explains why you might feel like you can’t scroll Reddit or listen to a podcast without seeing or hearing an ad for one.

So in 2023, we’re checking in again to see what, if anything, has changed. In addition to re-reviewing last year’s 27 mental health apps, we listened to our readers’ requests and reviewed five new apps: Cerebral, Insight Timer, Finch, Ginger, and Replika: My AI Friend.

What’s the verdict? Some are better! Many are worse. And more than a few are downright creepy.

The apps that improved!

There is good news in the world of mental health apps. We did see some improvement.

  • Moodfit improved their security by requiring strong passwords. Last year, we were able to log in to too many apps with passwords like “1” or “111111”. We still had that problem with a few apps this year, but others improved, mostly because we asked, and that’s no small thing.

  • Calm and Modern Health made improvements by clarifying in their privacy policies that all users, regardless of what privacy laws they live under, have the same rights to access and delete their data. This is great for people who don’t live under stronger privacy laws like the California Consumer Privacy Act (CCPA) or the EU’s General Data Protection Regulation (GDPR). It’s important, especially with apps that collect as much sensitive information as mental health apps do, that everyone be able to get their data deleted.

  • Woebot made improvements on both fronts: they updated their password policy, improving security, and updated their privacy policy to be much clearer about users’ control over their data. Now it says clearly that you can delete your information no matter where you live.

  • Recovery Record updated their privacy policy to say that they won’t share your personal data with third parties.

  • Youper is our most improved app overall. Like Woebot, Youper now clearly states that all users have the right to delete their data, and they updated their password requirements. They’ve also improved how much of your data they collect, how they collect it, and what they do with it. Good work, Youper!

  • And the best news of all: last year, two apps -- the AI chatbot Wysa and PTSD Coach -- made our “Best Of” list. Those two apps are still head and shoulders above the rest in terms of privacy and security and remain “Best Of.”

All in all, we saw about eight of the 27 apps we reviewed last year improve their privacy in some way, big or small.

The apps that got worse.

Unfortunately, we saw even more apps get worse at privacy and security. Around 17 of the 27 apps we reviewed last year either got worse or still had pretty bad privacy and security practices in 2023. Apps like Headspace, BetterHelp and Pride Counseling (owned by BetterHelp), and Talkspace fall into this unhappy category.

We put together a little highlight reel of awful for you, with a few of the most worrisome things we learned, after hundreds of hours of research:

  • BetterHelp has a $7.8 million judgment issued against them by the US regulatory agency the FTC for promising not to share sensitive mental health information for advertising and then turning around and doing just that. It’s great to see policymakers and regulators crack down on the super creepy practice of sharing sensitive health information to make money. Gross!
  • Talkspace buries in their privacy policy the fact that they can use inferences made about you from their sign-up questionnaire, which asks about things like your gender identity, sexual orientation, and whether you’re feeling depressed (all before you’re ever offered a privacy policy), for marketing purposes, including tailored ads.
  • Sesame Workshop, the company that makes the Breathe, Think, Do with Sesame app for kids, changed the privacy policy that covers all their apps to say that they can now gather information on users of their sites and combine it with data they get from data brokers, data-enhancement companies, and social media sites.
  • Replika: My AI Friend is perhaps the worst app we’ve ever reviewed. It received all four of our privacy dings and doesn’t meet our Minimum Security Standards.

  • Some apps seem to no longer be supported, which means no security updates and who knows what’s happening to any data the apps might collect. Those apps include Better Stop Suicide, Liberate, and RAINN.

  • Meditation app Shine, created by two women of color to focus on serving groups underrepresented in wellness, was bought by Headspace Health. That means all the data Shine owned probably got transferred over to Headspace Health. And Headspace is one of the apps whose privacy we rated worse this year.

At a glance: How the mental health apps stack up.

Here's how the apps did according to our privacy and security criteria.

What can you do?

So, the prognosis for mental health apps is not exactly great once again in 2023. And we realize that reading about privacy can sometimes make you feel like you need to go full tin-can-phone analog to stay safe. But don’t give up on mental health apps! There’s lots you can do to protect yourself.

Choose apps that you can trust (when possible)

Read the reviews; we write ‘em just for you! They’ll help you sort through the good, the bad, and the just-okay apps in terms of privacy.

In each review, there are custom tips on how to use that app in a way that preserves more of your privacy. And that goes double for the ones with warning labels. Even though we don’t recommend using them, we understand one may be the only app your employer offers, or be helping you too much to give up completely.

If you do have the choice, there are two apps we can actually recommend: Wysa and PTSD Coach. Wysa offers multiple levels of care through the app, with an “AI coach” as the suggested first step. Unlike with pretty much every other mental health app, the conversations between you and your virtual coach are totally private. And you can delete them whenever you want. That means even the privacy-conscious can let their guard down. (Hello, Wysa? It’s me…)

PTSD Coach is another goodie. It was developed by the US Department of Veterans Affairs National Center for PTSD, but that doesn’t mean it’s just for US veterans. The app can be (and is) helpful to people suffering from post-traumatic stress disorder all around the world. Because it doesn’t collect any personal information, you don’t have to stress about that being shared. According to their website, it’s basically a resource hub, with “fact and self-help skills based on research.” Nice.

Not only do both of these apps avoid all of our “dings,” but they really seem to value protecting their users’ privacy. It’s like they’re not in the data business at all! Really, they’re great.

Take steps to protect your own privacy

Since most of the mental health apps we looked at are, let’s say, imperfect with their privacy practices, here are some ways you can protect your privacy in general.

Limit the information you share

Keep your apps on a strictly need-to-know basis. Where you can, just say less, especially if it won’t improve your experience. Know that things like your answers to super personal surveys and, in some cases, your chat transcripts, might not be private.

You can also limit the flow of data that’s often collected automatically, by:

  • Not connecting your socials to your apps or using them to sign in.
  • Using your phone’s settings to deny the app permission to access your camera, microphone, images, or location unless it’s necessary, and turning on “Limit ad tracking” in your privacy settings.
  • Turning off ad personalization on Google.

Practice good cyber hygiene

These two are easy wins that can help you preserve the privacy of all your apps:

  • Keep them updated (so that you can benefit as soon as possible when security vulnerabilities are patched).
  • Always choose a strong and unique password. A password manager can help.

Ask to have your information deleted

Once you stop using the app, request to have your data deleted. You might think that deleting the app erases your data too, but, in our experience, it almost never works that way.

Help us send a message

No one should have to pay for mental healthcare with their privacy. Our goal is to encourage these apps – all of them – to hold themselves to a higher standard of privacy than the super low bar set by law and the status quo.

And, it’s working. At least eight of the apps we reviewed last year improved their security or privacy policies after we asked them to. Together, we can do more!

Jen Caltrider

During a somewhat improvisational period while working on her Master’s degree in Artificial Intelligence, Jen discovered she was better at telling stories than writing code. That realization led to an interesting career as a technology journalist at CNN. But her true passion in life has always been to leave the world a little better than she found it. That’s why she created, and still leads, Mozilla’s *Privacy Not Included initiative, fighting for privacy rights for everyone.

Misha Rykov

Originally from Kyiv and now based in Berlin, Misha worked at big tech companies and security consultancies before joining Mozilla’s privacy teams. He loves investigative storytelling and hates murky privacy policies above all else. Misha advocates for stronger and smarter privacy regulation, as well as a safer internet.

Zoë MacDonald

Zoë is a writer and digital strategist based in Toronto, Canada. Before her passion for digital rights led her to Mozilla and the *Privacy Not Included team, she wrote about cybersecurity and e-commerce. When she’s not investigating privacy issues for work, she keeps a close eye on the smart devices in her own home.

*Privacy Not Included