Today, Mozilla is publishing the second edition of its *Privacy Not Included guide to mental health apps

Despite progress in a category dealing with highly sensitive personal data, 19 out of 32 mental health apps are still slapped with *Privacy Not Included warning labels

(SAN FRANCISCO, CA | TUESDAY, MAY 2) — As demand for mental health services continues to rise, Mozilla’s latest *Privacy Not Included research reveals that mental health apps are failing to protect user privacy and security. Fifty-nine percent of the top apps investigated were slapped with *Privacy Not Included warning labels, while 40 percent have gotten worse over the past year, according to the research released today for Mental Health Awareness Month this May.

Researchers, however, did uncover some bright spots. Exposing shady data practices motivated some tech companies to do better: nearly one-third of the apps made some improvements over their 2022 performance. And two apps — PTSD Coach and the AI chatbot Wysa — received a “Best Of” citation, which Mozilla uses to spotlight the apps doing privacy and security right. Over 255 hours of research, including more than eight hours of research per product, went into creating the 2023 mental health app guide.

For the first time ever, *Privacy Not Included now ranks year-over-year performance, highlighting whether each app’s privacy and security practices have improved or worsened. Launched in 2017, *Privacy Not Included has a two-fold mission: to arm consumers with the information they need to choose products that protect their privacy, and to spur the tech industry to do more to safeguard consumers. Over the past six years, Mozilla has reviewed more than 100 apps and 300 internet-connected devices under the initiative.

Says Jen Caltrider, Mozilla’s *Privacy Not Included Lead: “Our main goal is better protection for consumers, so we were encouraged to see that some apps made changes that amount to better privacy for the public. And sometimes all that had to be done to make those positive changes was to ask the companies to do better. But the worst offenders are still letting consumers down in scary ways, tracking and sharing their most intimate information and leaving them incredibly vulnerable. The handful of apps that handle data responsibly and respectfully prove that it can be done right.”

Says Misha Rykov, Research Associate: “Poor privacy and security is inexcusable for any connected product. And this is especially true for mental health apps, which deal with incredibly sensitive data. While industry practices are improving slowly, it’s not happening nearly fast enough. Consumers should think twice and read the fine print before engaging with one of these apps.”


Key findings include:

Several popular apps, including Youper and Woebot, made positive changes. Youper is in the running for “most-improved app” for significantly strengthening both its password requirements and privacy policy. Woebot also updated its privacy policy to explain that all users now have the same rights to access and delete their own data. Another app, Modern Health, improved the transparency of its policy, which now states clearly that it doesn’t “sell, disclose, and/or share personal information to other businesses or third parties for monetary or valuable consideration.”

On the other end of the scale are apps like Replika: My AI Friend, which is one of the worst apps Mozilla has ever reviewed. It’s plagued by weak password requirements, sharing of personal data with advertisers, and recording of personal photos, videos, and voice and text messages consumers shared with the chatbot. Many other apps were packed with trackers, with the app Cerebral setting a new record for the number of trackers: 799 within the first minute of download. Several apps — Talkspace, Happify, and BetterHelp — pushed consumers into taking questionnaires up front without asking for consent or showing their privacy policies first.

Mozilla’s 2022 mental health apps investigation returned dismal results, with researchers concluding that mental health apps were “worse than any other product category” when it comes to privacy and security. That year, 29 out of 32 apps earned the *Privacy Not Included warning label.

The apps that Mozilla investigated connect users with therapists; feature AI chatbots; run community support pages; offer mood journals and well-being assessments; and more. Despite these apps dealing with incredibly sensitive issues — like depression, anxiety, suicidal thoughts, domestic violence, eating disorders, and PTSD — the worst of them routinely share data, target vulnerable users with personalized ads, allow weak passwords, and feature vague and poorly written privacy policies.