Youper is a digital mental health service "powered by Artificial Intelligence." It offers online therapy, behavioral coaches, medication services, and an AI chatbot. Youper really touts their combination of psychology and technology (especially AI) for their mental health solutions. How does Youper handle their users' privacy? Well, they just might be the most improved app for privacy we saw over the past year.
What could happen if something goes wrong?
First reviewed April 20, 2022. Review updated April 25, 2023.
Read our 2022 review:
Yikes Youper! When you say you're "the only mental health service that understands you," it seems maybe you mean that literally.
Not only does Youper collect a ton of personal information, they say they can share this personal information with a number of subsidiaries, affiliates, and third parties, including for marketing purposes. Now, not all that personal information can be used or shared with third parties for things like marketing purposes -- they say personal health information won't be shared with third parties for marketing purposes without your consent, for example. But a lot of that information can be used and shared with third parties for things like targeted advertising and that's what worries us.
That's a whole lot of personal information Youper collects that you really have to hope they keep secure. Just a reminder, they say on their own privacy page, "Unfortunately, the transmission of information via the internet is not completely secure. Although we will do our best to protect your personal information, we cannot guarantee the security of your data transmitted to our Website and App; any transmission is at your own risk." This is a good reminder: what you share on the internet is never, ever 100% safe and secure.
Is there more that worries us? Yeah, there's more. Youper says they "may disclose aggregated, de-identified information about our users, and information that does not identify any individual, without restriction." Here's where we remind you that such de-identified data has been found to be relatively easy to re-identify, especially if location data is included. And Youper doesn't meet our Minimum Security Standards, as we were able to log in to the app with the weak password "111111". Requiring a strong password on an app that collects and stores so much personal information seems like an easy way to show you care about protecting the privacy of your users. Too bad Youper doesn't do that.
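To illustrate why "de-identified" doesn't mean anonymous, here is a toy sketch of a linkage attack: joining an "anonymous" dataset with a public one on shared quasi-identifiers like ZIP code and birth year. All names and records below are invented for illustration; real attacks work the same way, just at scale.

```python
# Toy illustration (invented data): re-identifying "de-identified" records
# by joining them with a public dataset on shared quasi-identifiers.

anonymous_health_records = [
    {"zip": "94110", "birth_year": 1985, "diagnosis": "anxiety"},
    {"zip": "10001", "birth_year": 1972, "diagnosis": "depression"},
]

public_records = [
    {"name": "Alice Example", "zip": "94110", "birth_year": 1985},
    {"name": "Bob Example", "zip": "10001", "birth_year": 1972},
]

def reidentify(anon_rows, public_rows):
    """Match rows on (zip, birth_year) -- a combination that is often
    rare enough to single one person out of a dataset."""
    matches = []
    for anon in anon_rows:
        for person in public_rows:
            if (anon["zip"], anon["birth_year"]) == (person["zip"], person["birth_year"]):
                matches.append({"name": person["name"], "diagnosis": anon["diagnosis"]})
    return matches

for match in reidentify(anonymous_health_records, public_records):
    print(match["name"], "->", match["diagnosis"])
```

Neither dataset alone identifies anyone's diagnosis, but the join does; adding location history to the mix makes the quasi-identifiers even more unique.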
So yeah, Youper's huge personal data collection and rather scary privacy and security practices leave us thinking there might be better apps to help you manage your anxiety and depression. What's the worst that could happen? Ugh, we don't even want to go there. It would really suck if your personal profile leaked and went up for sale on the dark web where a sketchy group of bad guys accessed it and used it to make your life miserable by outing your mental health problems to the world. Nobody needs that.
Tips to protect yourself
- Do not log in using third-party accounts.
- Do not consent to sharing your personal data for marketing and advertising.
- Choose a strong password! You could use a password manager like 1Password or KeePass.
- Do not use social media plug-ins.
- Use your device's privacy controls to limit the app's access to your personal information (do not grant access to your camera, microphone, images, or location unless necessary).
- Keep your app regularly updated.
- Limit ad tracking via your device (e.g., on iPhone, go to Privacy -> Advertising -> Limit Ad Tracking) and the biggest ad networks (for Google, go to your Google account and turn off ad personalization).
- Request that your data be deleted once you stop using the app. Simply deleting an app from your device usually does not erase your personal data.
- When signing up, do not agree to tracking of your data if possible.
What can be used to sign up?
Apple and Google sign-ups are possible
What data does the company collect?
Name, contact information, education, employment, employment history, financial information, phone number, email, location
What is the company’s known track record of protecting users’ data?
No known privacy or security incidents discovered in the last 3 years.
Does this product meet our Minimum Security Standards?
Youper updated their password requirements to require a strong password after we reached out to them. Thank you, Youper.
Youper says "Youper AI Assistant is based on therapy and meditation. Through quick and insightful conversations, Youper helps you master life's ups and downs."
What kind of decisions does the AI make about you or for you?
Suggestions and meditative audios.
- AI in healthcare – American consumers reveal willingness to use but privacy concerns remain (YouGov)
- Therapy Brought to You by Silicon Valley (The San Francisco Standard)
- Peace of Mind...Evaluating the Privacy Practices of Mental Health Apps (Consumer Reports)
- The Digital Standard Case Study: Mental Health Apps (The Digital Standard)
- Mental Health Apps Aren't All As Private As You May Think (Consumer Reports)
- Global Mental Health Apps Market Size, Share & Industry Trends Analysis Report By Application, By Platform Type, By Regional Outlook and Forecast, 2021 - 2027 (Yahoo! Finance)
- Youper App Review 2022: Pros & Cons, Cost, & Who It's Right For (Choosing Therapy)
- Youper, a chatbot that helps users navigate their emotions, raises $3 million in seed funding (TechCrunch)
- Using AI Chatbots for mental health care – an expert opinion (Augmented Mental Health)
- The 6 Best Apps for Depression in 2022 (Psych Central)
- The Best Depression Apps (Healthline)
- Youper AI Review (One Mind Psyber Guide)
- The Therapy-App Fantasy (The Cut)
- Privacy Concerns About Emotional Chatbots (Infosec)
- Self-care apps want to make us happy. So why do they make us feel so bleak? (Mashable)
- This App Uses AI Chats To Provide Mental Health Support — Here's How It Works (Bustle)
- Acceptability and Effectiveness of Artificial Intelligence Therapy for Anxiety and Depression (Youper): Longitudinal Observational Study (National Center for Biotechnology Information)