Review date: April 25, 2023


Mozilla says

People voted: Somewhat creepy

Welcome to the brave new world of mental health AI chatbots. Woebot, started in 2017 by psychologists and AI researchers from Stanford, describes itself as "a choose-your-own-adventure self-help book that is capable of storing all of your entries, and gets more specific to your needs over time." Welp, the future is here, for better or worse. What exactly does therapy by emotional AI algorithm look like? Users download Woebot and the chatbot starts asking questions. Things like "How are you feeling?" and "What's going on in your world right now?" Woebot then uses "natural language processing, psychological expertise, excellent writing, and sense of humor to create the experience of a friendly informative conversation."

Based on reviews we saw in the app stores, some people feel more comfortable talking to a bot than a real person. And others found long wait times or high costs kept them from talking to a real person, so this free chatbot was what they turned to instead. At least one study has shown AI chatbots can be successful in reducing anxiety and depression. And investors recently dropped around $100 million into the company, so it seems AI chatbots are here to stay. What does Woebot's privacy look like? We're pleased to say, in 2023, Woebot's privacy practices seem pretty good to us.

What could happen if something goes wrong?

First reviewed April 20, 2022. Review updated April 25, 2023.

When we first reviewed Woebot in 2022, we had some concerns about their privacy. However, after we published our review, Woebot reached out to us and opened up a conversation to address our concerns. The result of those conversations was a set of updates to their privacy policy that better clarify how they protect their users' privacy. So now, here in 2023, we're happy to say we feel pretty good about Woebot's privacy. This is exactly the change we love to see in the world. Thank you, Woebot.

The biggest change we saw Woebot make to their privacy policy was to clarify that all users of their service have the same rights to access and delete their data. Their privacy policy now reads, "Anyone who uses the services can access, correct, or delete their personal data regardless of where they live or are physically located." This might seem like a small change, but ensuring all users, whether they live under strong privacy laws or not, have the same rights to access and delete data is a big deal to us here at *Privacy Not Included. We've actually shared Woebot's privacy policy with other companies when we asked them to clarify that all users have these rights, because we found Woebot's language so easy and simple to understand.

Over the past year Woebot says they also worked to simplify the language in their privacy policy. And in our review in 2023, we were happy to find it was clearer and easier to understand. This year they received none of our privacy or security dings, which is great. So, good work Woebot! We appreciate your willingness to listen, change, and work to protect and respect your users' privacy.

Read our 2022 review:

How good are AI chatbots like Woebot -- the kind you share all sorts of personal and emotional information with -- at protecting your privacy? That's a very good question. One of the biggest risks with AI chatbots is keeping the information you share with them during your conversations secure. That means making sure no one else can read the contents of the conversations you have with the bot. But AI algorithms need to learn to get better at chatting with you. So when Woebot (or any AI chatbot) says they are "keeping previous chats in mind to provide the most beneficial and timely therapeutic suggestions," what does that mean? According to Woebot, it means they review de-identified portions of conversations and compare the AI-suggested path to the path chosen by the user to retrain their algorithms. Here's hoping those de-identified conversations are truly de-identified.

We do know Woebot says all your communications are encrypted both in transit and at rest, which is good. We don't know exactly how they "keep your previous chats in mind" to improve their therapy bot though, and that is a little worrisome. Something we, and other experts, always worry about is racial, gender, and cultural bias making their way into AI algorithms. This would not be good for a therapy app. Does Woebot have a bias issue in their algorithm? We sure hope not. But we also can't tell. This isn't unique to Woebot though. We generally can't determine if there is bias in any proprietary AI algorithm. It's also good to remember that while your personal chats with a human therapist are covered by strict health privacy laws like HIPAA, your personal chats with an AI chatbot aren't always similarly protected. Woebot does say that they "treat all user data as Protected Health Information and adhere to all HIPAA and GDPR requirements."

How do Woebot's privacy policies look to us? We have a few concerns. Woebot says they can collect personal info like name, email, IP address, "inferences drawn from other personal information to create a profile about a consumer," and the information you give them in your conversations. They also say they can "obtain information about you from other sources, including through third party services and organizations to supplement information provided by you." So, Woebot can collect a good deal of personal information and supplement what you give them with even more information gathered from third parties. Then they say they can share some of this information with third parties, including insurance companies and a seemingly broad category they call "external advisors." They also say in their privacy policy that they share some of your information, such as identifiers and internet or network activity, with marketing partners for advertising purposes. We were a little confused by this, because they also state in their privacy policy, "We never, ever sell or share your data with advertisers." Those two statements seem in conflict to us.

Finally, Woebot says they aggregate or de-identify your personal information, including location and device information, and share it with third parties. This is a pretty common practice, but we also must remind you that it has been found to be pretty easy to re-identify such data, especially if location data is included.
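To see why "de-identified" doesn't always mean "anonymous," here's a minimal sketch of the classic re-identification attack: joining records stripped of names against an outside dataset on shared quasi-identifiers. All names, records, and fields below are made up for illustration; this has nothing to do with Woebot's actual data.

```python
# Hypothetical "de-identified" records: no names, but they still carry
# quasi-identifiers (ZIP code and birth year).
deidentified_chats = [
    {"zip": "68508", "birth_year": 1990, "pet": "HuskerDoodle", "mood": "anxious"},
    {"zip": "10001", "birth_year": 1985, "pet": "Rex", "mood": "calm"},
]

# A separate, public dataset (think voter rolls or social profiles)
# that DOES carry names alongside the same quasi-identifiers.
public_profiles = [
    {"name": "Alex Example", "zip": "68508", "birth_year": 1990},
]

# Join the two datasets on the quasi-identifiers. No name was ever in
# the chat data, yet the match pins an "anonymous" record to a person.
reidentified = [
    (profile["name"], chat)
    for chat in deidentified_chats
    for profile in public_profiles
    if chat["zip"] == profile["zip"]
    and chat["birth_year"] == profile["birth_year"]
]

print(reidentified[0][0])  # Alex Example
```

The more unique the quasi-identifiers (a rare pet name, a precise location), the easier this join becomes.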

What's the worst that could happen with Woebot? Hopefully nothing. But what if the previous chats they keep in mind to provide you more beneficial therapeutic suggestions end up not being completely de-identified because you mentioned your dog HuskerDoodle in them, and no one else has a dog named HuskerDoodle, and that chat conversation gets leaked, and the world knows all about your relationship with HuskerDoodle? OK, so this isn't likely to happen. Still, it's a good reminder that anything you share on the internet isn't 100% secure, that chats that are de-identified could, potentially, be re-identified under some circumstances, and that Woebot is a for-profit company as well as your helpful mental health friend. Their own privacy policy states, "Unfortunately, no system is 100% secure, and we cannot ensure or warrant the security of any personal data you provide to us. To the fullest extent permitted by applicable law, we do not accept liability for unintentional disclosure." That's a good reminder to be careful out there, folks.

Tips to protect yourself

  • Do not log in using third-party accounts
  • Do not connect to any third party via the app, or at least make sure that a third party employs decent privacy practices
  • Do not give consent for sharing of personal data for marketing and advertising.
  • Choose a strong password! You may use a password manager like 1Password, KeePass, etc.
  • Do not use social media plug-ins.
  • Use your device privacy controls to limit access to your personal information via the app (do not give access to your camera, microphone, images, or location unless necessary).
  • Keep your app regularly updated
  • Limit ad tracking via your device (e.g., on iPhone, go to Privacy -> Advertising -> Limit ad tracking) and the biggest ad networks (for Google, go to your Google account and turn off ad personalization).
  • Request your data be deleted once you stop using the app. Simply deleting an app from your device usually does not erase your personal data.
  • When signing up, do not agree to tracking of your data if possible.

Can it snoop on me?

Camera

Device: N/A

App: No

Microphone

Device: N/A

App: No

Tracks location

Device: N/A

App: No

What can be used to sign up?

What data does the company collect?

How does the company use this data?

"We never sell or share your personal data with advertisers. Never have, never will."

"We will not share:

Your conversational interactions with Woebot, like what you write or your path through a conversation, unless you give us your consent to do so.

We often share:

De-identified and/or aggregated data about how Woebot users use the app and its effectiveness.

If you are using the Services through a clinical program or as part of a study and you consent, we may also share:

Identifiable data about how you’re doing such as survey responses, mood trends, or your confirmation that Woebot has understood a concerning entry that’s beyond what it can support"

"We may use your personal data to create de-identified and/or aggregated data, like approximate location information, information about the device you use to access our Services, information about conversational trends, or other analyses we create. De-identified and/or aggregated data is not personal data and we may use and share this data as permitted by applicable law, such as with academic partners. We never share your transcripts with Woebot without your consent, even de-identified."

"We use third party service providers, like Amazon Web Services, that help us provide our Services. These third parties may have limited and controlled access to personal data in connection with the services they provide such as hosting or customer service. The use of personal data by service providers outside of agreed-upon service they provide is prohibited."

How can you control your data?

"Anyone who uses the services can access, correct, or delete their personal data regardless of where they live or are physically located."

"You control your personal data. Request to access it, correct it, or delete it, whenever you want. Share as much or as little as you like."

"The Services may contain links to third party websites or applications not covered by this privacy policy. We do not endorse, screen, or approve, and are not responsible for, the privacy practices or content of such other websites or applications. Providing personal data to third party websites or applications is at your own risk."

What is the company’s known track record of protecting users’ data?


No known privacy or security incidents discovered in the last 3 years.

Child Privacy Information

"The services are not directed to children (defined as under the age of 13 or another age as required by local law), and we do not knowingly collect personal data from children. If you learn that your child has provided us with personal data without your consent, please Contact Us. If we learn that we have collected a child’s personal data in violation of applicable law, we will delete that personal data (unless we have a legal obligation to keep it) and close the child’s account."

Can this product be used offline?


User-friendly privacy information?


Links to privacy information

Does this product meet our Minimum Security Standards?




Encryption

All data is encrypted both at rest with AES-256 or better and in transit with TLS 1.2 or better.
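For readers wondering what "in transit with TLS 1.2 or better" means in practice, here's a minimal sketch of a client enforcing that floor, using Python's standard `ssl` module. This is illustrative only and is not Woebot's code.

```python
import ssl

# Default context: certificate verification and hostname checking on.
context = ssl.create_default_context()

# Refuse to negotiate anything older than TLS 1.2.
context.minimum_version = ssl.TLSVersion.TLSv1_2

# Any socket wrapped with this context will now fail the handshake
# against servers that only speak TLS 1.1 or older.
print(context.minimum_version >= ssl.TLSVersion.TLSv1_2)  # True
```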

Strong password


Woebot has added a strong password requirement of a minimum of 10 characters; 1 uppercase character; 1 lowercase character; 1 number; and 1 special character.
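As a minimal sketch, the stated policy (10+ characters, with at least one uppercase letter, one lowercase letter, one digit, and one special character) can be expressed as a simple check. The function below is our own illustration, not Woebot's actual validation code.

```python
import string

def meets_policy(password: str) -> bool:
    """Check a password against the policy described above."""
    return (
        len(password) >= 10
        and any(c.isupper() for c in password)
        and any(c.islower() for c in password)
        and any(c.isdigit() for c in password)
        and any(c in string.punctuation for c in password)
    )

print(meets_policy("Tr0ub4dor&x"))  # True
print(meets_policy("short1!"))      # False (fewer than 10 characters)
```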

Security updates


Woebot has a scheduled monthly patching cycle.

Manages vulnerabilities


Woebot says they respond to emergency vulnerabilities. They test the security of design by performing and remediating findings of penetration tests, vulnerability assessments, internal compliance reviews and more. To report a security vulnerability, Woebot says users can message them directly in the app, email [email protected], or use their contact form.

Privacy policy


Does the product use AI?


Woebot says they are "keeping previous chats in mind to provide the most beneficial and timely therapeutic suggestions."

According to Woebot, this means they periodically review de-identified portions of conversations and compare the AI-suggested path to the path chosen by the user. When these paths diverge, they retrain their algorithms using the additional de-identified data to help Woebot’s conversational ability improve and learn.

Is this AI untrustworthy?

Can’t Determine

What kind of decisions does the AI make about you or for you?

Therapeutic suggestions

Is the company transparent about how the AI works?


Woebot follows the FDA design control documentation, which requires an explicit description of the product and its methods of delivery, including transparency of the AI algorithms and how they ensure safety for users.

Does the user have control over the AI features?


Yes, the AI helps recommend conversation paths that might be most beneficial for the user, but the user is always given the choice of what path to follow.
