Mental health apps became a major resource for young adults during the pandemic, a time when much of the population experienced increased anxiety, depression, loneliness, and other mental health concerns. While these apps are meant to be a safe space for young users to find care and support, too many of them lack the privacy protections needed to keep that sensitive information safe.

Mozilla reviewed 32 mental health apps and found that 28 of them share users’ personal information with third-party companies. This is especially concerning for apps aimed at teens, who often use them while in a vulnerable state and may not be well-equipped to read privacy documents and understand what they’re signing up for.

Check out more in our *Privacy Not Included mental health apps guide

“If the user is under the age of 13, the app’s privacy policy must comply with the Children’s Online Privacy Protection Act (COPPA), which requires the app to obtain parental consent before collecting the child’s information,” says Bethany Corbin, a privacy and digital health attorney at Nixon Gwilt Law. But teens and young adults between the ages of 13 and 17 do not have similar protections. “This means users over the age of 13 are treated as adults and are expected to comprehend lengthy and confusing privacy policies to understand their data rights.”

Because of these complicated privacy disclosures, “teens and young adults often do not understand the gravity of the information they are sharing in these mental health apps, nor do they possess the necessary skills to discern the poor quality of apps containing inaccurate counseling guidance and advice,” says Stephanie Benoit Kurtz, Lead Information Systems and Technology Faculty, School of Business and Information Technology at the University of Phoenix. “This, in combination with potentially poor security controls, can spell trouble for young users who are searching for tools in the mental health space.”

Take 7 Cups, for instance, one of the most popular online mental health platforms, which uses both volunteer listeners and paid therapists to offer peer-to-peer and professional counseling. In face-to-face counseling, information shared by the client is strictly confidential, and the counselor is not allowed to reveal those details to anyone unless required by law. Things are very different online. The 7 Cups app not only shares personal user information with third parties but also reviews personal chats between users and the peer listeners or therapists. Moreover, apps like 7 Cups, BlahTherapy, The Mighty, and HearMe rely on random internet users to offer support, so there’s no guarantee that any information you enter into the apps will be protected. These volunteer listeners, or “peer counselors” as they’re called, can simply screenshot a chat and reveal the most intimate details young users have shared online in a desperate attempt to get help.

“These ‘non-clinical’ mental health apps are neither subject to HIPAA guidelines nor are they governed by federal privacy restrictions or regulations,” Corbin says. “They also connect to social media platforms, and some privacy policies will allow data sharing back to the social media platform, which can result in users seeing targeted advertisements on platforms like Facebook from data entered into the mental health app.” This can lead to harmful marketing and advertising tactics and can trigger mental health issues for users who are just browsing social media for fun.

Do mental health apps really need all this data?

The biggest security concerns show up in “free”-to-use mental health apps. There’s no such thing as a free lunch on the internet, and mental health apps made by for-profit companies are no exception. If your app isn’t made by a non-profit like Sesame Workshop, Anxiety Canada, or the US’s PTSD Coach, a third-party company might be paying for the data you hand over to your mental health app. “The free apps aren’t a charity and need to generate revenue in some manner, so they are more likely to sell data to data brokers or other third parties,” Corbin says.

Some “clinical” or paid (or at least freemium) apps may feel less pressure to sell sensitive information to third-party collectors. These apps often work directly with therapists and other licensed mental health professionals to offer virtual counseling that closely matches in-person therapy (including following official privacy regulations). Because the apps generate revenue directly from their users, there’s less need to sell users’ data for money. For instance, the AI chatbot and online therapy app Wysa doesn’t require an account to use the tools available, so no personal information is shared or collected. Another example is meditation apps like Let’s Meditate, which don’t require you to create an account or enter any personal information to access the content.

Unfortunately, the number of apps offering decent privacy protection is limited since “the standards and requirements for the development of such apps have not kept up with the surge of apps on the market,” Kurtz says. “This is disappointing as accessing essential mental healthcare shouldn’t come at the cost of exposing your sensitive information online.”

Sakshi Udavant

*Privacy Not Included