Welcome to the brave new world of mental health AI chatbots. Woebot, started in 2017 by psychologists and AI researchers from Stanford, describes itself as "a choose-your-own-adventure self-help book that is capable of storing all of your entries, and gets more specific to your needs over time." Welp, the future is here, for better or worse. What exactly does therapy by emotional AI algorithm look like? Users download Woebot and the chatbot starts asking questions. Things like "How are you feeling?" and "What's going on in your world right now?" Woebot then uses "natural language processing, psychological expertise, excellent writing, and sense of humor to create the experience of a friendly informative conversation."
Based on reviews we saw in the app stores, some people feel more comfortable talking to a bot than a real person. And others found long wait times or high costs kept them from talking to a real person, so this free chatbot was what they turned to instead. At least one study has shown AI chatbots can be successful in reducing anxiety and depression. And investors dropped around $100 million in the company recently, so it seems AI chatbots are here to stay. What does the privacy of Woebot look like? We're pleased to say, in 2023, Woebot's privacy practices seem pretty good to us.
What can happen if something goes wrong?
First reviewed April 20, 2022. Review updated April 25, 2023.
Read our 2022 review:
How good are AI chatbots -- like Woebot -- that you share all sorts of personal and emotional information with at protecting privacy? That's a very good question. One of the biggest risks with AI chatbots is keeping the information you share with them during your conversations secure. That means making sure no one else can read the contents of the conversations you have with the bot. But AI algorithms need to learn to get better at chatting with you. So when Woebot (or any AI chatbot) says they are "keeping previous chats in mind to provide the most beneficial and timely therapeutic suggestions," what does that mean? According to Woebot, that means they review de-identified portions of conversations and compare the AI-suggested path to the path chosen by the user to retrain their algorithms. Here's hoping those de-identified conversations are truly de-identified.
We do know Woebot says all your communications are encrypted both in transit and at rest, which is good. We don't know exactly how they "keep your previous chats in mind" to improve their therapy bot though, and that is a little worrisome. Something we, and other experts, always worry about is racial, gender, and cultural bias making its way into AI algorithms. This would not be good for a therapy app. Does Woebot have a bias issue in their algorithm? We sure hope not. But we also can't tell. This isn't unique to Woebot though. We generally can't determine if there is bias in any proprietary AI algorithm. It's also good to remember that while your personal chats with a human therapist are covered by strict health privacy laws like HIPAA, your personal chats with an AI chatbot aren't always similarly protected. Woebot does say that they "treat all user data as Protected Health Information and adhere to all HIPAA and GDPR requirements."
Finally, Woebot says they aggregate or de-identify your personal information, including location and device information, and share it with third parties. This is a pretty common practice, but we must also remind you that it has been found to be pretty easy to re-identify such data, especially if location data is included.
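To see why "de-identified" doesn't always mean anonymous, here is a minimal, purely illustrative Python sketch. All of the data and field names are invented; the point is the classic linkage attack, where records stripped of names still carry quasi-identifiers (ZIP code, birth date, gender) that can be joined against a public dataset that does have names.

```python
# Hypothetical "de-identified" records: direct identifiers are gone, but
# quasi-identifiers remain.
deidentified_records = [
    {"zip": "02138", "birth_date": "1965-07-31", "gender": "F", "diagnosis": "anxiety"},
    {"zip": "60614", "birth_date": "1990-01-15", "gender": "M", "diagnosis": "depression"},
]

# A public dataset (think voter rolls) that still carries names.
public_records = [
    {"name": "Jane Doe", "zip": "02138", "birth_date": "1965-07-31", "gender": "F"},
    {"name": "John Roe", "zip": "60614", "birth_date": "1990-01-15", "gender": "M"},
]

def reidentify(health_records, named_records):
    """Join the two datasets on quasi-identifiers alone."""
    matches = []
    for hr in health_records:
        for nr in named_records:
            if all(hr[k] == nr[k] for k in ("zip", "birth_date", "gender")):
                matches.append({"name": nr["name"], "diagnosis": hr["diagnosis"]})
    return matches

print(reidentify(deidentified_records, public_records))
# Each "anonymous" record links back to a named individual.
```

With location data in the mix, the quasi-identifiers get even more distinctive, which is why sharing "de-identified" location data is riskier than it sounds.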
Tips to protect yourself
- Do not log in using third-party accounts.
- Do not connect to any third party via the app, or at least make sure the third party employs decent privacy practices.
- Do not give consent for sharing of personal data for marketing and advertising.
- Choose a strong password! You can use a password manager like 1Password, KeePass, etc.
- Do not use social media plug-ins.
- Use your device privacy controls to limit access to your personal information via the app (do not give access to your camera, microphone, images, or location unless necessary).
- Keep your app regularly updated.
- Limit ad tracking via your device (e.g., on iPhone go to Privacy -> Advertising -> Limit ad tracking) and the biggest ad networks (for Google, go to your Google account and turn off ad personalization).
- Request your data be deleted once you stop using the app. Simply deleting an app from your device usually does not erase your personal data.
- When starting a sign-up, do not agree to tracking of your data if possible.
Device: Not available
What do you need to sign up?
Third-party account
What data does the company collect?
Email, birthday, name.
How does the company use this data?
How can you control your data?
What is the company's known track record of protecting users' data?
No known privacy or security incidents discovered in the last 3 years.
Privacy information for children
Can this product be used offline?
Links to privacy information
Does this product meet our Minimum Security Standards?
All data is encrypted both at rest with AES-256 or better and in transit with TLS 1.2 or better.
Woebot has added a strong password requirement of a minimum of 10 characters; 1 uppercase character; 1 lowercase character; 1 number; and 1 special character.
Woebot has a scheduled monthly patching cycle.
Woebot says they respond to emergency vulnerabilities. They test the security of design by performing and remediating findings of penetration tests, vulnerability assessments, internal compliance reviews and more. To report a security vulnerability, Woebot says users can message them directly in the app, email [email protected], or use their contact form.
Woebot says they are "keeping previous chats in mind to provide the most beneficial and timely therapeutic suggestions."
According to Woebot, this means they periodically review de-identified portions of conversations and compare the AI-suggested path to the path chosen by the user. When these paths diverge, they retrain their algorithms using the additional de-identified data to help Woebot’s conversational ability improve and learn.
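Woebot hasn't published its retraining pipeline, but the comparison it describes can be sketched in a few lines. In this purely hypothetical Python example (all field names and data invented), sessions where the user's chosen path diverged from the AI's suggestion are kept as retraining candidates, with session identifiers stripped before anything reaches a training set.

```python
# Hypothetical session logs: what the AI suggested vs. what the user chose.
sessions = [
    {"session_id": "a1", "suggested": "breathing_exercise", "chosen": "breathing_exercise"},
    {"session_id": "b2", "suggested": "thought_challenge",  "chosen": "gratitude_journal"},
    {"session_id": "c3", "suggested": "mood_checkin",       "chosen": "sleep_tips"},
]

def retraining_candidates(sessions):
    """Keep only sessions where the user diverged from the suggestion,
    dropping identifiers before they reach the training set."""
    return [
        {"suggested": s["suggested"], "chosen": s["chosen"]}
        for s in sessions
        if s["suggested"] != s["chosen"]
    ]

print(retraining_candidates(sessions))
# Two divergent pairs survive; session IDs are stripped.
```

The privacy question, of course, is how thorough that identifier-stripping step is in practice, which is exactly what outsiders can't verify.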
Is this AI untrustworthy?
What kind of decisions does the AI make about you or for you?
Is the company transparent about how the AI works?
Does the user have control over the AI features?
Therapy by chatbot? The promise and challenges in using AI for mental health (NPR)
Therapy Brought to You by Silicon Valley (The San Francisco Standard)
The Chatbot Therapist Will See You Now (WIRED)
Privacy Concerns About Emotional Chatbots (Infosec)
Something Bothering You? Tell It to Woebot. (NY Times)
I spent 2 weeks texting a bot about my anxiety — and found it to be surprisingly helpful (Business Insider)
Pooling Mental Health Data with Chatbots (Cambridge University Press)
Dramatic growth in mental-health apps has created a risky industry (The Economist)
Woebot – the bleeding intelligent self-help therapist and companion (Harvard Business School Digital Initiative)
Do Mental Health Chatbots Work? (Healthline)
The wellness industry's risky embrace of AI-driven mental health care (The Brookings Institution)
I Actually Kind of Love My Chatbot Therapist (Lifehacker)
Your AI Chatbot Therapist Isn't Sure What It's Doing (Gizmodo)
Mental health chatbot Woebot gets $90m boost (Silicon Republic)
This mental health app wants to improve your mood (Creative Bloq)
Making Mental Health Radically Accessible: A Conversation with Allison Darcy, Founder and President of Woebot Health (AI Fund)
I bonded with a quirky robot after chatting to it about my fears (New Scientist)
Mental health chatbot Woebot could be adapted to tackle substance use (MobiHealthNews)
Have a comment? Let us know.