Welcome to the brave new world of mental health AI chatbots. Woebot, started in 2017 by psychologists and AI researchers from Stanford, describes itself as "a choose-your-own-adventure self-help book that is capable of storing all of your entries, and gets more specific to your needs over time." Welp, the future is here, for better or worse. What exactly does therapy by emotional AI algorithm look like? Users download Woebot and the chatbot starts asking questions. Things like "How are you feeling?" and "What's going on in your world right now?" Woebot then uses "natural language processing, psychological expertise, excellent writing, and sense of humor to create the experience of a friendly informative conversation."
Based on reviews we saw in the app stores, some people feel more comfortable talking to a bot than a real person. And others found long wait times or high costs kept them from talking to a real person, so this free chatbot was what they turned to instead. At least one study has shown AI chatbots can be successful in reducing anxiety and depression. And investors recently dropped around $100 million into the company, so it seems AI chatbots are here to stay. What does the privacy of Woebot look like? We're pleased to say, in 2023, Woebot's privacy practices seem pretty good to us.
What could happen if something goes wrong?
First reviewed April 20, 2022. Review updated April 25, 2023.
When we first reviewed Woebot in 2022 we had some concerns about their privacy. However, after we published our review, Woebot reached out to us and opened up a conversation to address our concerns. The result of those conversations was a set of updates to their privacy policy that better clarify how they protect their users' privacy. So now here in 2023, we're happy to say, we feel pretty good about Woebot's privacy. This is exactly the change we love to see in the world. Thank you Woebot.
The biggest change we saw Woebot make to their privacy policy was to clarify that all users of their service have the same rights to access and delete their data. Their privacy policy now reads, "Anyone who uses the services can access, correct, or delete their personal data regardless of where they live or are physically located." This might seem like a small change, but ensuring all users, whether they live under strong privacy laws or not, have the same rights to access and delete data is a big deal to us here at *Privacy Not Included. We've actually shared Woebot's privacy policy with other companies when we asked them to clarify that all users have these rights, because we found Woebot's language so easy and simple to understand.
Over the past year Woebot says they also worked to simplify the language in their privacy policy. And in our review in 2023, we were happy to find it was clearer and easier to understand. This year they received none of our privacy or security dings, which is great. So, good work Woebot! We appreciate your willingness to listen, change, and work to protect and respect your users' privacy.
Read our 2022 review:
How good are AI chatbots like Woebot, which you share all sorts of personal and emotional information with, at protecting your privacy? That's a very good question. One of the biggest risks with AI chatbots is keeping the information you share with them during your conversations secure. That means making sure no one else can read the contents of the conversations you have with the bot. But AI algorithms need to learn to get better at chatting with you. So when Woebot (or any AI chatbot) says they are "keeping previous chats in mind to provide the most beneficial and timely therapeutic suggestions," what does that mean? According to Woebot, it means they review de-identified portions of conversations and compare the AI-suggested path to the path chosen by the user to retrain their algorithms. Here's hoping those de-identified conversations are truly de-identified.
We do know Woebot says all your communications are encrypted both in transit and at rest, which is good. We don't know exactly how they "keep your previous chats in mind" to improve their therapy bot though, and that is a little worrisome. Something we, and other experts, always worry about is racial, gender, and cultural bias making their way into AI algorithms. This would not be good for a therapy app. Does Woebot have a bias issue in their algorithm? We sure hope not. But we also can't tell. This isn't unique to Woebot though. We generally can't determine if there is bias in any proprietary AI algorithm. It's also good to remember that while your personal chats with a human therapist are covered by strict health privacy laws like HIPAA, your personal chats with an AI chatbot aren't always similarly protected. Woebot does say that they "treat all user data as Protected Health Information and adhere to all HIPAA and GDPR requirements."
How do Woebot's privacy policies look to us? We have a few concerns. Woebot says they can collect personal info like name, email, IP address, "inferences drawn from other personal information to create a profile about a consumer," and the information you give them in your conversations. They also say they can "obtain information about you from other sources, including through third party services and organizations to supplement information provided by you." So, Woebot can collect a good deal of personal information and add to the information you give them with even more information gathered from third parties. Then they say they can share some of this information with third parties, including insurance companies and a seemingly broad category they call "external advisors." They also say in their privacy policy that they share some of your information, such as identifiers and internet network activity, with marketing partners for advertising purposes. We were a little confused by this because they also state in their privacy policy, "We never, ever sell or share your data with advertisers." Those two statements seem in conflict to us.
Finally, Woebot says they aggregate or de-identify your personal information, including location and device information, and share it with third parties. This is a pretty common practice, but we also must remind you that it has been found to be pretty easy to de-anonymize such data, especially if location data is included.
What's the worst that could happen with Woebot? Hopefully nothing. But, what if the previous chats they keep in mind to provide you more beneficial therapeutic suggestions end up not being completely de-identified because you mentioned your dog HuskerDoodle in them and no one else has a dog named HuskerDoodle, and that chat conversation gets leaked and the world knows all about your relationship with HuskerDoodle? OK, so this isn't likely to happen. Still, it's a good reminder that anything you share on the internet isn't 100% secure, that chats that are de-identified could, potentially, be re-identified under some circumstances, and that Woebot is a for-profit company as well as your helpful mental health friend. Their own privacy policy states, "Unfortunately, no system is 100% secure, and we cannot ensure or warrant the security of any personal data you provide to us. To the fullest extent permitted by applicable law, we do not accept liability for unintentional disclosure." That's a good reminder to be careful out there folks.
Tips to protect yourself
- Do not log in using third-party accounts
- Do not connect to any third party via the app, or at least make sure that a third party employs decent privacy practices
- Do not give consent for sharing of personal data for marketing and advertisement.
- Choose a strong password! You may want to use a password manager like 1Password, KeePass, etc.
- Do not use social media plug-ins.
- Use your device privacy controls to limit access to your personal information via the app (do not give access to your camera, microphone, images, or location unless necessary)
- Keep your app regularly updated
- Limit ad tracking via your device (e.g., on iPhone go to Privacy -> Advertising -> Limit ad tracking) and the biggest ad networks (for Google, go to your Google account and turn off ad personalization)
- Request your data be deleted once you stop using the app. Simply deleting an app from your device usually does not erase your personal data.
- When starting a sign-up, do not agree to tracking of your data if possible.
Can it snoop on me?
Camera
Device: Not available
App: No
Microphone
Device: Not available
App: No
Tracks location
Device: Not available
App: No
What can be used to sign up?
Email address
Yes
Phone number
No
Third-party account
No
What data does the company collect?
Personal
Email, birthday, name.
Body related
Social
How does the company use this data?
How can you control your data?
How has the company handled its consumers' data in the past?
No known privacy or security incidents discovered in the last 3 years.
Child privacy information
Can this product be used offline?
User-friendly privacy information?
Links to privacy information
Does this product meet our Minimum Security Standards?
Encryption
All data is encrypted both at rest with AES-256 or better and in transit with TLS 1.2 or better.
Strong password
Woebot has added a strong password requirement: a minimum of 10 characters, including at least 1 uppercase character, 1 lowercase character, 1 number, and 1 special character.
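For illustration, Woebot's stated password requirement can be expressed as a simple check. This is just a sketch: the function name and the use of Python's string.punctuation as the definition of "special character" are our assumptions, not Woebot's actual implementation.

```python
import string

def meets_stated_password_policy(password: str) -> bool:
    """Check a password against Woebot's stated policy: at least 10
    characters, with at least one uppercase letter, one lowercase
    letter, one number, and one special character."""
    return (
        len(password) >= 10
        and any(c.isupper() for c in password)
        and any(c.islower() for c in password)
        and any(c.isdigit() for c in password)
        and any(c in string.punctuation for c in password)
    )
```

So a password like "Abcdef1!xy" would pass, while something shorter or all-lowercase would not.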
Security updates
Woebot has a scheduled monthly patching cycle.
Manages vulnerabilities
Woebot says they respond to emergency vulnerabilities. They test the security of design by performing and remediating findings of penetration tests, vulnerability assessments, internal compliance reviews and more. To report a security vulnerability, Woebot says users can message them directly in the app, email [email protected], or use their contact form.
Privacy policy
Woebot says they are "keeping previous chats in mind to provide the most beneficial and timely therapeutic suggestions."
According to Woebot, this means they periodically review de-identified portions of conversations and compare the AI-suggested path to the path chosen by the user. When these paths diverge, they retrain their algorithms using the additional de-identified data to help Woebot’s conversational ability improve and learn.
Is this AI untrustworthy?
What kind of decisions does the AI make about you or for you?
Is the company transparent about how the AI works?
Does the user have control over the AI features?
Dive deeper
- Therapy by chatbot? The promise and challenges in using AI for mental health (NPR)
- Therapy Brought to You by Silicon Valley (The San Francisco Standard)
- The Chatbot Therapist Will See You Now (WIRED)
- Privacy Concerns About Emotional Chatbots (Infosec)
- Something Bothering You? Tell It to Woebot. (NY Times)
- I spent 2 weeks texting a bot about my anxiety — and found it to be surprisingly helpful (Business Insider)
- Pooling Mental Health Data with Chatbots (Cambridge University Press)
- Dramatic growth in mental-health apps has created a risky industry (The Economist)
- Woebot – the bleeding intelligent self-help therapist and companion (Harvard Business School Digital Initiative)
- Do Mental Health Chatbots Work? (Healthline)
- The wellness industry's risky embrace of AI-driven mental health care (The Brookings Institution)
- I actually Kind of Love My Chatbot Therapist (Lifehacker)
- Your AI Chatbot Therapist Isn't Sure What It's Doing (Gizmodo)
- Mental health chatbot Woebot gets $90m boost (Silicon Republic)
- This mental health app wants to improve your mood (Creative Bloq)
- Making Mental Health Radically Accessible: A Conversation with Allison Darcy, Founder and President of Woebot Health (AI Fund)
- I bonded with a quirky robot after chatting to it about my fears (New Scientist)
- Mental health chatbot Woebot could be adapted to tackle substance use (MobiHealthNews)
Comments
Got a comment? Let us know.