Youper is a digital mental health service "powered by Artificial Intelligence." It offers online therapy, behavioral coaches, medication services, and an AI chatbot. Youper really touts their combination of psychology and technology (especially AI) as the heart of their mental health solutions. How does Youper handle their users' privacy? Well, they just might be the most improved app for privacy we saw over the past year.
What could happen if something goes wrong?
First reviewed April 20, 2022. Review updated April 25, 2023.
In 2022, Youper's privacy policy raised a bunch of eyebrows for us. In 2023, things look much better. They seem to share much less data. In fact, according to their privacy policy, they seem to share almost no data with third parties (we hope!). Yay for getting better in 2023, Youper. And more good news from Youper: they updated their password requirement to require a strong password after we reached out to them in 2023. Youper is doing much better this year indeed, and might just be our most improved app overall.
There is one thing we'd love to see Youper do better, though. They drop new users into a questionnaire asking sensitive personal questions before giving them an opportunity to review the privacy policy and see how the answers to these questions might be used or protected. That's a practice we really don't like, and one too many mental health app companies seem to use.
Read our 2022 review:
Yikes Youper! When you say you're "the only mental health service that understands you," it seems maybe you mean that literally.
Youper's privacy policy says they can collect a whole lot of personal information. Everything from name, address, email, birth date, race, religion, sexual orientation and political beliefs, to "Information stored on your mobile device, including in other applications" and even real-time information about the location of your devices. Heck, when you download the Youper app, it asks for permission to access your wearable sensor data (like heart rate monitors). They even say they "will combine information we receive from other sources with information you give to us and information we collect about you." Yikes again!
Not only does Youper collect a ton of personal information, they say they can share this personal information with a number of subsidiaries, affiliates, and third parties, including for marketing purposes. Now, not all of that personal information can be used or shared with third parties for marketing -- they say personal health information won't be shared with third parties for marketing purposes without your consent, for example. But a lot of it can be used and shared with third parties for things like targeted advertising, and that's what worries us.
That's a whole lot of personal information Youper collects that you really have to hope they keep secure. Just a reminder, they say on their own privacy page, "Unfortunately, the transmission of information via the internet is not completely secure. Although we will do our best to protect your personal information, we cannot guarantee the security of your data transmitted to our Website and App; any transmission is at your own risk." This is a good reminder that what you share on the internet is never, ever 100% safe and secure.
Is there more that worries us? Yeah, there's more. Youper says they "may disclose aggregated, de-identified information about our users, and information that does not identify any individual, without restriction." Here's where we remind you that such de-identified data has been found to be relatively easy to re-identify, especially if location data is included. And Youper doesn't meet our Minimum Security Standards, because we were able to log in to the app with the weak password "111111". Requiring a strong password on an app that collects and stores so much personal information seems like an easy way to show you care about protecting the privacy of your users. Too bad Youper doesn't do that.
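To put that last point in perspective: a basic password-strength check takes only a few lines of code. Here is a minimal sketch (our own illustration with made-up names, not Youper's actual sign-up logic) of the kind of server-side validation that would have rejected "111111":

```python
import re

# A few common throwaway passwords that should always be rejected.
# Real products should also check against a much larger breached-password list.
COMMON_PASSWORDS = {"111111", "123456", "password", "qwerty"}

def is_strong_password(password: str, min_length: int = 12) -> bool:
    """Return True only if the password clears some basic strength checks."""
    if len(password) < min_length:
        return False
    if password.lower() in COMMON_PASSWORDS:
        return False
    # Require a mix of character classes: uppercase, lowercase, and digits.
    if not (re.search(r"[A-Z]", password)
            and re.search(r"[a-z]", password)
            and re.search(r"\d", password)):
        return False
    return True

print(is_strong_password("111111"))             # False -- the password we logged in with
print(is_strong_password("Correct-Horse-42!"))  # True
```

In practice an app would lean on a maintained library or a breached-password list rather than a hand-rolled check like this, but the point stands: rejecting "111111" is not hard.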
So yeah, Youper's huge personal data collection and rather scary privacy and security practices leave us thinking there might be better apps to help you manage your anxiety and depression. What's the worst that could happen? Ugh, we don't even want to go there. It would really suck if your personal profile leaked and went up for sale on the dark web, where a sketchy group of bad guys accessed it and used it to make your life miserable by outing your mental health problems to the world. Nobody needs that.
Tips to protect yourself
- Do not log in using third-party accounts.
- Do not give consent for sharing of personal data for marketing and advertising.
- Choose a strong password! You can use a password manager like 1Password or KeePass.
- Do not use social media plug-ins.
- Use your device's privacy controls to limit the app's access to your personal information (do not give access to your camera, microphone, images, or location unless necessary).
- Keep your app regularly updated.
- Limit ad tracking via your device (e.g., on iPhone, go to Privacy -> Advertising -> Limit Ad Tracking) and the biggest ad networks (for Google, go to your Google account and turn off ad personalization).
- Request that your data be deleted once you stop using the app. Simply deleting an app from your device usually does not erase your personal data.
- When signing up, do not agree to tracking of your data if possible.
Can it snoop on me?
Camera
Device: Not applicable
App: Yes
Microphone
Device: Not applicable
App: No
Tracks location
Device: Not applicable
App: Yes
What can be used to sign up?
Email: Yes
Phone: No
Third-party account: Yes
Sign-up with an Apple or Google account is possible.
What data does the company collect?
Personal: Name, contact information, education, employment, employment history, financial information, phone number, email, location
Body related: Health data
Social
How does the company use this data?
How can you control your data?
What is the company's known track record of protecting users' data?
No known privacy or security incidents discovered in the last 3 years.
Child Privacy Information
Can this product be used offline?
User-friendly privacy information?
Links to privacy information
Does this product meet our Minimum Security Standards?
Encryption
Strong password
Youper updated their password requirement to require a strong password after we reached out to them. Thank you Youper.
Security updates
Manages vulnerabilities
Privacy policy
Youper says "Youper AI Assistant is based on therapy and meditation. Through quick and insightful conversations, Youper helps you master life's ups and downs."
Is this AI untrustworthy?
What kind of decisions does the AI make about you or for you?
Is the company transparent about how the AI works?
Does the user have control over the AI features?
Dive Deeper
- AI in healthcare – American consumers reveal willingness to use but privacy concerns remain (YouGov)
- Therapy Brought to You by Silicon Valley (The San Francisco Standard)
- Peace of Mind...Evaluating the Privacy Practices of Mental Health Apps (Consumer Reports)
- The Digital Standard Case Study: Mental Health Apps (The Digital Standard)
- Mental Health Apps Aren't All As Private As You May Think (Consumer Reports)
- Global Mental Health Apps Market Size, Share & Industry Trends Analysis Report By Application, By Platform Type, By Regional Outlook and Forecast, 2021 - 2027 (Yahoo! Finance)
- Youper App Review 2022: Pros & Cons, Cost, & Who It's Right For (Choosing Therapy)
- Youper, a chatbot that helps users navigate their emotions, raises $3 million in seed funding (TechCrunch)
- Using AI Chatbots for mental health care - an expert opinion (Augmented Mental Health)
- The 6 Best Apps for Depression in 2022 (Psych Central)
- The Best Depression Apps (Healthline)
- Youper AI Review (One Mind Psyber Guide)
- The Therapy-App Fantasy (The Cut)
- Privacy Concerns About Emotional Chatbots (Infosec)
- Self-care apps want to make us happy. So why do they make us feel so bleak? (Mashable)
- This App Uses AI Chats To Provide Mental Health Support — Here's How It Works (Bustle)
- Acceptability and Effectiveness of Artificial Intelligence Therapy for Anxiety and Depression (Youper): Longitudinal Observational Study (National Center for Biotechnology Information)
Comments
Got a comment? Let us know.