Wysa describes itself as "an emotionally intelligent chatbot that uses AI to react to the emotions you express." Launched in 2016 by founder Jo Aggarwal, who discovered bots were easier to talk to when dealing with her own depression, the app says it offers "a mood tracker, mindfulness coach, anxiety helper, and mood-boosting buddy, all rolled into one." Along with its AI chatbot, Wysa also offers in-app text-based chat sessions with real, live therapists. Wysa offers some free access to its AI chatbot. To access more features, users can choose between two subscription options: a lower-cost option with access to the tools, or a higher-cost option with access to both the tools and coaches. So, how does Wysa's privacy look? We're happy to say, Wysa's privacy looks pretty dang good! They seem to be one of the rare mental health apps that isn't looking to make money off your personal information. Good work Wysa!
What could happen if something goes wrong?
First reviewed April 20, 2022. Review updated April 25, 2023.
Good news! One year later and Wysa is still really good at privacy! Wysa still doesn't track users for advertising, collect lots of personal information for targeted advertising purposes, or share or sell lots of data with third parties. In other words, they're still not being an awful company that doesn't respect privacy. Good work Wysa! Keep being a leader on privacy for all to see! Lord knows, we need many more companies like Wysa out there.
Oh, one more thing we love about Wysa. They have one of the best privacy policies we've ever read. Lots of clear, useful information, and a nice record of when they've updated the policy and what changed. When we ask companies to write better privacy policies, this is one we hope they will look to for guidance.
Read our 2022 review:
There are so many bad-for-your-privacy mental health apps out there in the world it is depressing. Thank goodness Wysa doesn't seem to be one of them! They don't require any personal identifiers to use their service. They don't request your personal data. They don't share your personal data. They don't sell it either. What?!? It's so refreshing to see a mental health app with strong privacy practices. Good on you Wysa.
Wysa does say they may use social media or other marketing, but that no direct marketing is performed and no personal data is shared for this purpose. Again, that is very good. Consumer Reports did find in 2021 that Wysa shared some data with Facebook, but from what we can tell, that is likely not personal information. As for all those AI chatbot conversations, Wysa says they will never share that data without user consent.
Wysa says they can share aggregated, de-identified data for analytics and to help improve their AI. We're not too worried about this, as Wysa says they don't process geolocation at a level that makes your data identifiable. Still, we feel compelled to mention that it has been found to be relatively easy to re-identify such user data when the location data is more precise.
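To make that precision point concrete, here is a small illustrative Python sketch (our own example, not anything from Wysa's systems) showing how rounding GPS coordinates coarsens a location record: roughly five decimal places pins a person down to about a meter, while two decimal places only narrows them to about a kilometer, which is much harder to link back to one individual.

```python
# Illustrative only: how coordinate precision affects how identifying a "de-identified" record is.
def generalize(lat: float, lon: float, decimals: int) -> tuple[float, float]:
    """Round coordinates to the given number of decimal places."""
    return round(lat, decimals), round(lon, decimals)

precise = (51.50140, -0.14189)              # ~1 m precision: effectively a street address
coarse = generalize(*precise, decimals=2)   # ~1 km precision: a neighborhood

print("precise:", precise)   # (51.5014, -0.14189)
print("coarse: ", coarse)    # (51.5, -0.14)
```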
All in all, Wysa seems to be a breath of fresh air in the mental health app space. They actually take steps to implement privacy and security by design and by default. We absolutely love that here at *Privacy Not Included. Thank you Wysa and please, keep up the good work!
Tips to protect yourself
For extra security, you can use the app without registering. You can also use a fake name.
The app providers give the following security tips:
- Always lock your mobile screen by setting a password. Use strong passwords and keep passwords private. Never leave your device unattended.
- Always extend your mobile screen password to set an App PIN to keep your conversations with the App private.
- Always keep your mobile operating system up-to-date.
- Enable remote access on your devices so you can locate and control them remotely in the event your device gets stolen.
- Install anti-virus software to protect against virus attacks and infections.
- Avoid phishing emails. Do not open files, click on links or download programs from an unknown source.
- Be wise about using Wi-Fi. Before you send personal and sensitive data over your laptop or mobile device on a public wireless network in a coffee shop, library, airport, hotel, or other public place, see if your data will be protected.
Can it snoop on me?
Camera
Device: N/A
App: No
Microphone
Device: N/A
App: Yes
Tracks location
Device: N/A
App: No
What can be used to sign up?
Email: Yes
Phone: No
Third-party account: No
What data does the company collect?
Personal
Nickname; all other information is optional
Body related
Wellness information (such as feelings, sentiment, mood, major life events, well-being assessments, coping ability, energy levels, objectives)
Social
How does the company use this data?
How can you control your data?
What is the company’s known track record of protecting users’ data?
No known privacy or security incidents discovered in the last 3 years.
Child Privacy Information
Can this product be used offline?
User-friendly privacy information?
Wysa offers a detailed privacy policy with definitions, plus a privacy FAQ.
Links to privacy information
Does this product meet our Minimum Security Standards?
Encryption
Wysa uses TLS/SSL encryption for data in transit and AES-256 encryption for data at rest.
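As a rough illustration of what the "at rest" half of that can look like in practice, here is a minimal AES-256-GCM sketch using Python's `cryptography` library. This is a generic example of the technique, not Wysa's actual implementation, and the key handling is deliberately simplified.

```python
# Illustrative only: AES-256-GCM encryption of a record "at rest", not Wysa's implementation.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # 256-bit key; real systems keep this in a key-management service
aesgcm = AESGCM(key)

plaintext = b"Today I felt anxious before my presentation."
nonce = os.urandom(12)                      # unique 96-bit nonce per record
ciphertext = aesgcm.encrypt(nonce, plaintext, None)

# Only someone holding the key (and the nonce stored alongside the record) can read it back.
recovered = aesgcm.decrypt(nonce, ciphertext, None)
assert recovered == plaintext
```

In a real deployment the key would live in a key-management service and a fresh nonce would be stored alongside each encrypted record.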
Strong password
Security updates
Manages vulnerabilities
Privacy policy
According to Wysa, "Wysa AI Coach is an artificial intelligence-based 'emotionally intelligent' service which responds to the emotions you express and uses evidence-based cognitive-behavioral techniques (CBT), DBT, meditation, breathing, yoga, motivational interviewing and micro-actions to help you build mental resilience skills and feel better."
The AI Coach will always check whether it has understood you correctly before progressing.
Is this AI untrustworthy?
What kind of decisions does the AI make about you or for you?
Is the company transparent about how the AI works?
Does the user have control over the AI features?
Dive Deeper
- Wysa Receives FDA Breakthrough Device Designation for AI-led Mental Health Conversational Agent (Business Wire)
- Wysa raises $20 million to expand its therapist chatbot into a wider set of mental health services (TechCrunch)
- Therapy by chatbot? The promise and challenges in using AI for mental health (NPR)
- Mental health app Wysa raises $5.5M for 'emotionally intelligent' AI (TechCrunch)
- Wysa: Mental Health Support (Common Sense Media)
- Do Mental Health Chatbots Work? (Healthline)
- How Wysa App Helps People With Depression And Anxiety Lead A Stress-Free Life (Times of India)
- Meet the Women Founders Behind Shine and Wysa, Two Apps Focused on Mental Health and Self-Care (Yahoo!)
- Wysa Review (One Mind Psyber Guide)
- I Chatted With a Therapy Bot to Ease My Covid Fears. It Was Bizarre. (OneZero)
- Privacy Concerns About Emotional Chatbots (Infosec)
- Peace of Mind...Evaluating the Privacy Practices of Mental Health Apps (Consumer Reports)
- The Digital Standard Case Study: Mental Health Apps (The Digital Standard)
- Mental Health Apps Aren't All As Private As You May Think (Consumer Reports)
- Wysa (Mental Health America)