
Welcome to the brave new world of mental health AI chatbots. Woebot, started in 2017 by psychologists and AI researchers from Stanford, describes itself as "a choose-your-own-adventure self-help book that is capable of storing all of your entries, and gets more specific to your needs over time." Welp, the future is here, for better or worse. What exactly does therapy by emotional AI algorithm look like? Users download Woebot and the chatbot starts asking questions. Things like "How are you feeling?" and "What's going on in your world right now?" Woebot then uses "natural language processing, psychological expertise, excellent writing, and sense of humor to create the experience of a friendly informative conversation."
Based on reviews we saw in the app stores, some people feel more comfortable talking to a bot than a real person. And others found long wait times or high costs kept them from talking to a real person, so this free chatbot was what they turned to instead. At least one study has shown AI chatbots can be successful in reducing anxiety and depression. And investors dropped around $100 million into the company recently, so it seems AI chatbots are here to stay. What does Woebot's privacy look like? Well, when we first reviewed Woebot we had some concerns. However, after we published our review, Woebot updated their privacy policy and made some changes to better clarify how they protect their users' privacy. We still have some concerns, but with their updated privacy policy, not as many as before.
What could happen if something goes wrong?
How good are AI chatbots like Woebot, which you share all sorts of personal and emotional information with, at protecting privacy? That's a very good question. One of the biggest risks with AI chatbots is keeping the information you share with them during your conversations secure. That means making sure no one else can read the contents of your conversations with the bot. But AI algorithms need to learn to get better at chatting with you. So when Woebot (or any AI chatbot) says they are "keeping previous chats in mind to provide the most beneficial and timely therapeutic suggestions," what does that mean? According to Woebot, it means they review de-identified portions of conversations and compare the AI-suggested path to the path the user chose in order to retrain their algorithms. Here's hoping those de-identified conversations are truly de-identified.
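To make that tangible, here is a minimal sketch of what scrubbing identifiers out of a chat transcript before it is stored or reviewed could look like. The patterns and the `scrub` helper are our own illustration; Woebot's actual de-identification pipeline is not public, and a real one (for example, one meeting HIPAA Safe Harbor) covers far more identifiers than this.

```python
import re

# Hypothetical redaction patterns; a production pipeline would handle
# many more identifier types (addresses, dates, medical record numbers...).
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "PHONE": re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),
    "NAME":  re.compile(r"\bmy name is \w+", re.IGNORECASE),
}

def scrub(message: str) -> str:
    """Replace obvious identifiers with placeholder tokens."""
    out = PATTERNS["EMAIL"].sub("[EMAIL]", message)
    out = PATTERNS["PHONE"].sub("[PHONE]", out)
    out = PATTERNS["NAME"].sub("my name is [NAME]", out)
    return out

print(scrub("my name is Ana, reach me at [email protected] or 555-123-4567"))
# -> my name is [NAME], reach me at [EMAIL] or [PHONE]
```

Even a careful version of this can miss unique details (an unusual pet name, a one-of-a-kind job title), which is exactly why "de-identified" deserves scrutiny.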
We do know Woebot says all your communications are encrypted both in transit and at rest, which is good. Something we, and other experts, always worry about is racial, gender, and cultural bias making its way into AI algorithms. That would not be good in a therapy app. Does Woebot have a bias problem in their algorithm? We sure hope not. But we also can't tell. This isn't unique to Woebot, though. We generally can't determine whether there is bias in any proprietary AI algorithm. It's also good to remember that while your personal chats with a human therapist are covered by strict health privacy laws like HIPAA, your personal chats with an AI chatbot aren't always similarly protected. Woebot does say that they "treat all user data as Protected Health Information and adhere to all HIPAA and GDPR requirements."
How do Woebot's privacy policies look to us? We have a few concerns. Woebot says they can collect personal info like name, email, IP address, "inferences drawn from other personal information to create a profile about a consumer," and the information you give them in your conversations. They also say they can "obtain information about you from other sources, including through third party services and organizations to supplement information provided by you." So, Woebot can collect a good deal of personal information and supplement what you give them with even more information gathered from third parties. Then they say they can share some of this information with third parties, including insurance companies and a seemingly broad category they call "external advisors." They also say in their privacy policy that they share some of your information, such as identifiers and internet or other network activity, with marketing partners for advertising purposes. We were a little confused by this because they also state in their privacy policy, "We never, ever sell or share your data with advertisers." Those two statements seem to be in conflict.
Finally, Woebot says they aggregate or de-identify your personal information, including location and device information, and share it with third parties. This is a pretty common practice, but we must also remind you that it has been shown to be pretty easy to re-identify such data, especially if location data is included.
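As an illustration of why location data undermines de-identification: researchers (for example, de Montjoye et al.'s 2013 "Unique in the Crowd" study) found that just a few spatio-temporal points uniquely identify most people in a mobility dataset. The toy data and `top_places` helper below are hypothetical, not drawn from Woebot:

```python
from collections import Counter

# Toy "anonymized" location pings: (user_token, rounded_lat, rounded_lon).
# The token is random, but movement patterns still leak identity.
pings = [
    ("u1", 40.71, -74.00), ("u1", 40.71, -74.00), ("u1", 40.75, -73.99),
    ("u2", 40.71, -74.00), ("u2", 40.65, -73.95), ("u2", 40.65, -73.95),
]

def top_places(user, k=2):
    """A user's k most-visited places, e.g., home and work."""
    visits = Counter(p[1:] for p in pings if p[0] == user)
    return tuple(sorted(place for place, _ in visits.most_common(k)))

# If each user's (home, work) pair is unique, the pair alone re-identifies
# them: cross-reference it with any public record (property listings,
# employer addresses) and the "anonymous" token falls apart.
signatures = {u: top_places(u) for u in {"u1", "u2"}}
print(signatures, "uniquely identifying:",
      len(set(signatures.values())) == len(signatures))
```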
What's the worst that could happen with Woebot? Hopefully nothing. But what if the previous chats they keep in mind to provide you more beneficial therapeutic suggestions end up not being completely de-identified because you mentioned your dog HuskerDoodle in them, and no one else has a dog named HuskerDoodle, and that chat conversation gets leaked, and the world learns all about your relationship with HuskerDoodle? OK, so this isn't likely to happen. Still, it's a good reminder that nothing you share on the internet is 100% secure, that chats that are de-identified could potentially be re-identified under some circumstances, and that Woebot is a for-profit company as well as your helpful mental health friend. Their own privacy policy states, "Unfortunately, no system is 100% secure, and we cannot ensure or warrant the security of any personal data you provide to us. To the fullest extent permitted by applicable law, we do not accept liability for unintentional disclosure." That's a good reminder to be careful out there, folks.
Tips to protect yourself
- Don't connect your app to any social networks like Facebook.
- Don't allow the app to access your location.
Can it snoop on me?
Camera
Device: Not applicable
App: No
Microphone
Device: Not applicable
App: No
Tracks location
Device: Not applicable
App: No
What can be used to sign up?
Email
Yes
Phone
No
Third-party account
No
What data does the company collect?
Personal
Name, email.
Body related
Responses to treatment and satisfaction surveys.
Social
How does the company use this data?
How can you control your data?
What is the company's known track record of protecting users' data?
No known privacy or security incidents discovered in the last 3 years.
Child privacy information
Can this product be used offline?
User-friendly privacy information?
Links to privacy information
Does this product meet our Minimum Security Standards?
Encryption
All data is encrypted both at rest with AES-256 or better and in transit with TLS 1.2 or better.
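For readers curious what "encrypted at rest with AES-256" can look like in practice, here is a minimal sketch using authenticated AES-256-GCM from Python's cryptography library. This is a generic illustration of the technique, not Woebot's actual storage code:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# A 256-bit key; in production this would live in a key-management
# service, never alongside the data it protects.
key = AESGCM.generate_key(bit_length=256)
aesgcm = AESGCM(key)

def encrypt_at_rest(plaintext: bytes) -> bytes:
    """Encrypt a chat record before writing it to storage."""
    nonce = os.urandom(12)  # unique per message, never reused with a key
    return nonce + aesgcm.encrypt(nonce, plaintext, None)

def decrypt_at_rest(blob: bytes) -> bytes:
    """Decrypt; raises an exception if the ciphertext was tampered with."""
    nonce, ciphertext = blob[:12], blob[12:]
    return aesgcm.decrypt(nonce, ciphertext, None)

stored = encrypt_at_rest(b"I'm feeling anxious about work today.")
assert decrypt_at_rest(stored) == b"I'm feeling anxious about work today."
```

Note that encryption at rest protects against stolen disks and database leaks, not against the company itself (or anyone holding the key) reading your chats.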
Strong password
Security updates
Woebot has a scheduled monthly patching cycle.
Manages vulnerabilities
Woebot says they respond to emergency vulnerabilities. They test the security of their design by performing penetration tests, vulnerability assessments, internal compliance reviews, and more, and by remediating the findings. To report a security vulnerability, Woebot says users can message them directly in the app, email [email protected], or use their contact form.
Privacy policy
Woebot says they are "keeping previous chats in mind to provide the most beneficial and timely therapeutic suggestions."
According to Woebot, this means they periodically review de-identified portions of conversations and compare the AI-suggested path to the path chosen by the user. When these paths diverge, they retrain their algorithms on the additional de-identified data so that Woebot's conversational ability improves over time.
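As a rough sketch of what that divergence check could look like (our own hypothetical illustration; Woebot's real pipeline is not public), one might collect de-identified turns where the user's chosen path differs from the model's suggestion and queue those as retraining examples:

```python
from dataclasses import dataclass

@dataclass
class Turn:
    """One de-identified conversation step (hypothetical schema)."""
    context: str         # scrubbed conversation history
    suggested_path: str  # what the model proposed
    chosen_path: str     # what the user actually picked

def divergent_examples(turns: list[Turn]) -> list[tuple[str, str]]:
    """Keep (context, chosen_path) pairs where the user disagreed with
    the model; these become targets for the next retraining round."""
    return [(t.context, t.chosen_path)
            for t in turns if t.chosen_path != t.suggested_path]

log = [
    Turn("felt low after work", "breathing exercise", "thought journaling"),
    Turn("slept badly", "sleep hygiene tips", "sleep hygiene tips"),
]
print(divergent_examples(log))
# -> [('felt low after work', 'thought journaling')]
```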
Is this AI untrustworthy?
What kind of decisions does the AI make about you or for you?
Therapeutic suggestions
Is the company transparent about how the AI works?
Does the user have control over the AI features?
Dive Deeper
- The Chatbot Therapist Will See You Now (WIRED)
- Privacy Concerns About Emotional Chatbots (Infosec)
- Something Bothering You? Tell It to Woebot. (NY Times)
- I spent 2 weeks texting a bot about my anxiety — and found it to be surprisingly helpful (Business Insider)
- Pooling Mental Health Data with Chatbots (Cambridge University Press)
- Dramatic growth in mental-health apps has created a risky industry (The Economist)
- Woebot – the bleeding intelligent self-help therapist and companion (Harvard Business School Digital Initiative)
- Do Mental Health Chatbots Work? (Healthline)
- The wellness industry’s risky embrace of AI-driven mental health care (The Brookings Institution)
- I Actually Kind of Love My Chatbot Therapist (Lifehacker)
- Your AI Chatbot Therapist Isn’t Sure What It’s Doing (Gizmodo)
- Mental health chatbot Woebot gets $90m boost (Silicon Republic)
- This mental health app wants to improve your mood (Creative Bloq)
- Making Mental Health Radically Accessible: A Conversation with Allison Darcy, Founder and President of Woebot Health (AI Fund)
- I bonded with a quirky robot after chatting to it about my fears (New Scientist)
- Mental health chatbot Woebot could be adapted to tackle substance use (MobiHealthNews)