Wysa describes itself as "an emotionally intelligent chatbot that uses AI to react to the emotions you express." Launched in 2016 by founder Jo Aggarwal, who discovered bots were easier to talk to when dealing with her own depression, the app says it offers "a mood tracker, mindfulness coach, anxiety helper, and mood-boosting buddy, all rolled into one." Along with Wysa's AI chatbot, the app also offers in-app text-based chat sessions with real, live therapists. Wysa offers some free access to their AI chatbot. To access more features, users can choose between two subscription options: $99 per year for access to the tools, or $99 per month for access to the tools and coaches.
So, how does Wysa's privacy look? We're so happy to say, Wysa's privacy looks pretty dang good! They seem to be one of the rare mental health apps that isn't looking to make money off your personal information. Good work Wysa!
What could happen if something goes wrong?
There are so many bad-for-your-privacy mental health apps out there in the world it is depressing. Thank goodness Wysa doesn't seem to be one of them! They don't require any personal identifiers to use their service. They don't request your personal data. They don't share your personal data. They don't sell it either. What?!? It's so refreshing to see a mental health app with strong privacy practices. Good on you Wysa.
Wysa says they perform no direct marketing, and while they may use social media or other marketing channels, no personal data is shared for this purpose. Again, that is very good. Consumer Reports did find in 2021 that Wysa shared some data with Facebook, but from what we can tell, that is likely not personal information. As for all those AI chatbot conversations, Wysa says they will never share that data without user consent.
Wysa says they can share aggregated, de-identified data for analytics and to help improve their AI. We're not too worried about this with Wysa, as they say they don't process geolocation at a level that makes your data identifiable. Still, we feel compelled to mention that it has been shown to be relatively easy to re-identify users from such data when the location information is more precise.
All in all, Wysa seems to be a breath of fresh air in the mental health app space. They actually take steps to implement privacy and security by design and by default. We absolutely love that here at *Privacy Not Included. Thank you Wysa and please, keep up the good work!
Tips to protect yourself
- For extra security, you can use the app without registering.
- Wysa provides the following security tips:
  - Always lock your mobile screen with a password. Use strong passwords, keep them private, and never leave your device unattended.
  - In addition to your screen lock, set an App PIN to keep your conversations with the App private.
  - Always keep your mobile operating system up-to-date.
  - Enable remote access on your devices so you can locate and control them remotely if a device is lost or stolen.
  - Install anti-virus software to protect against virus attacks and infections.
  - Watch out for phishing emails. Do not open files, click on links, or download programs from an unknown source.
  - Be wise about using Wi-Fi. Before you send personal or sensitive data from your laptop or mobile device on a public wireless network in a coffee shop, library, airport, hotel, or other public place, check whether your data will be protected.
What can be used to sign up?
Nickname; all other information is optional.
What data does the company collect?
Wellness information (such as feelings, sentiment, mood, major life events, well-being assessments, coping ability, energy levels, objectives)
How does the company use this data?
How can you control your data?
What is the company’s known track record of protecting users’ data?
No known privacy or security incidents discovered in the last 3 years.
Child Privacy Information
Can this product be used offline?
User-friendly privacy information?
Wysa provides an extensive privacy FAQ.
Links to privacy information
Does this product meet our Minimum Security Standards?
Wysa uses TLS and SSL encryption for data in transit and AES-256 encryption for data at rest.
According to Wysa, "Wysa AI Coach is an artificial intelligence-based 'emotionally intelligent' service which responds to the emotions you express and uses evidence-based cognitive-behavioral techniques (CBT), DBT, meditation, breathing, yoga, motivational interviewing and micro-actions to help you build mental resilience skills and feel better."
The AI Coach will always check whether it has understood you correctly before proceeding.
Is this AI untrustworthy?
What kind of decisions does the AI make about you or for you?
Wysa decides what kind of help/content to provide you.
Is the company transparent about how the AI works?
Does the user have control over the AI features?
Got a comment? Let us hear it.