Wysa describes itself as "an emotionally intelligent chatbot that uses AI to react to the emotions you express." Launched in 2016 by founder Jo Aggarwal, who discovered bots were easier to talk to while dealing with her own depression, the app says it offers "a mood tracker, mindfulness coach, anxiety helper, and mood-boosting buddy, all rolled into one." Along with Wysa's AI chatbot, the app also offers in-app, text-based chat sessions with real, live therapists. Wysa offers some free access to their AI chatbot. To access more features, users can choose between two subscription options: a lower-cost option with access to the self-help tools, or a higher-cost option with access to the tools and coaches. So, how does Wysa's privacy look? We're happy to say, Wysa's privacy looks pretty dang good! They seem to be one of the rare mental health apps that isn't looking to make money off your personal information. Good work Wysa!
What could happen if something goes wrong?
First reviewed April 20, 2022. Review updated April 25, 2023.
Good news! One year later and Wysa is still really good at privacy! Wysa still doesn't track users for advertising, doesn't collect lots of personal information for targeted advertising purposes, doesn't share or sell lots of data to third parties, and generally isn't an awful company that doesn't respect privacy. Good work Wysa! Keep being a leader on privacy for all to see! Lord knows, we need many more companies like Wysa out there.
Oh, one more thing we love about Wysa. They have one of the best privacy policies we've ever read: lots of clear, useful information, and a nice record of when they've updated the policy and what has changed. When we ask companies to write better privacy policies, this is one we hope they will look to for guidance.
Read our 2022 review:
There are so many bad-for-your-privacy mental health apps out there in the world it is depressing. Thank goodness Wysa doesn't seem to be one of them! They don't require any personal identifiers to use their service. They don't request your personal data. They don't share your personal data. They don't sell it either. What?!? It's so refreshing to see a mental health app with strong privacy practices. Good on you Wysa.
Wysa does say that while they perform no direct marketing, they may use social media or other marketing channels, but no personal data is shared for this purpose. Again, that is very good. Consumer Reports did find in 2021 that Wysa shared some data with Facebook, but from what we can tell, that is likely not personal information. And as for all those AI chatbot conversations, Wysa says they will never share that data without user consent.
Wysa says they can share aggregated, de-identified data for analytics and to help improve their AI. We're not too worried about this with Wysa, as they say they don't process geolocation at a level that makes your data identifiable. But we also feel compelled to mention that it has been found to be relatively easy to re-identify such user data when the location data is more precise.
All in all, Wysa seems to be a breath of fresh air in the mental health app space. They actually take steps to implement privacy and security by design and by default. We absolutely love that here at *Privacy Not Included. Thank you Wysa and please, keep up the good work!
Tips to protect yourself
For extra security, you can use the app without registering. You can also use a fake name.
The app providers give the following security tips:
- Always lock your mobile screen by setting a password. Use strong passwords and keep passwords private. Never leave your device unattended.
- In addition to your mobile screen password, set an App PIN to keep your conversations with the App private.
- Always keep your mobile operating system up-to-date.
- Enable remote access on your devices so you can locate and control them remotely in the event a device gets stolen.
- Install anti-virus software to protect against virus attacks and infections.
- Avoid phishing emails. Do not open files, click on links or download programs from an unknown source.
- Be wise about using Wi-Fi. Before you send personal and sensitive data over your laptop or mobile device on a public wireless network in a coffee shop, library, airport, hotel, or other public place, see if your data will be protected.
Can it snoop on me?
Camera
Device: Not applicable
App: No
Microphone
Device: Not applicable
App: Yes
Tracks location
Device: Not applicable
App: No
What can be used to sign up?
Email address
Yes
Phone
No
Third-party account
No
What data does the company collect?
Personal
Nickname; all other personal data is optional
Body related
Wellness information (such as feelings, sentiment, mood, major life events, well-being assessments, coping ability, energy levels, objectives)
Social
How does the company use this data?
How can you control your data?
What is the company's known track record of protecting users' data?
No known privacy or security incidents discovered in the last 3 years.
Child privacy information
Can this product be used offline?
User-friendly privacy information?
Wysa offers a detailed privacy policy with definitions, as well as a privacy FAQ.
Links to privacy information
Does this product meet our Minimum Security Standards?
Encryption
Wysa uses TLS/SSL encryption for data in transit and AES-256 encryption for data at rest.
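For context on what that claim means in practice: TLS protects data as it travels between the app and Wysa's servers (SSL is its deprecated predecessor), while AES-256 is a cipher used to scramble stored data. As an illustrative sketch only (not Wysa's actual code), here is what the transit side looks like with Python's standard-library `ssl` module: a default client context that verifies server certificates and hostnames, pinned to a modern minimum protocol version so legacy SSL is ruled out entirely.

```python
import ssl

# Illustrative sketch, not Wysa's code: a TLS client context of the kind
# any app making the "encrypted in transit" claim would use.
ctx = ssl.create_default_context()

# Refuse anything older than TLS 1.2 (this excludes all SSL versions).
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

# The default context already enforces certificate and hostname checks.
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # certificates are verified
print(ctx.check_hostname)                    # hostnames are verified
```

The at-rest half of the claim (AES-256) is typically handled by the database or storage layer rather than application code, so it doesn't show up in a sketch like this.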
Strong password
Security updates
Manages vulnerabilities
Privacy policy
According to Wysa, "Wysa AI Coach is an artificial intelligence-based 'emotionally intelligent' service which responds to the emotions you express and uses evidence-based cognitive-behavioral techniques (CBT), DBT, meditation, breathing, yoga, motivational interviewing and micro-actions to help you build mental resilience skills and feel better."
The AI Coach will always check whether it has understood you correctly before progressing.
Is this AI untrustworthy?
What kind of decisions does the AI make about you or for you?
Is the company transparent about how the AI works?
Does the user have control over the AI features?
Dive deeper
- Wysa Receives FDA Breakthrough Device Designation for AI-led Mental Health Conversational Agent (Business Wire)
- Wysa raises $20 million to expand its therapist chatbot into a wider set of mental health services (TechCrunch)
- Therapy by chatbot? The promise and challenges in using AI for mental health (NPR)
- Mental health app Wysa raises $5.5M for 'emotionally intelligent' AI (TechCrunch)
- Wysa: Mental Health Support (Common Sense Media)
- Do Mental Health Chatbots Work? (Healthline)
- How Wysa App Helps People With Depression And Anxiety Lead A Stress-Free Life (Times of India)
- Meet the Women Founders Behind Shine and Wysa, Two Apps Focused on Mental Health and Self-Care (Yahoo!)
- Wysa Review (One Mind Psyber Guide)
- I Chatted With a Therapy Bot to Ease My Covid Fears. It Was Bizarre. (OneZero)
- Privacy Concerns About Emotional Chatbots (Infosec)
- Peace of Mind...Evaluating the Privacy Practices of Mental Health Apps (Consumer Reports)
- The Digital Standard Case Study: Mental Health Apps (The Digital Standard)
- Mental Health Apps Aren't All As Private As You May Think (Consumer Reports)
- Wysa (Mental Health America)
Comments
Got a comment? Let us know.