Warning: *privacy not included with this product



Review date: July 28, 2022


Mozilla says

People voted: Super creepy

If you've been online lately, chances are you've seen or heard one of the many ads Talkspace runs all over the place -- in podcasts, on TV, on streaming services, on Facebook -- featuring celebrities such as Michael Phelps and Demi Lovato. Talkspace offers users access to online therapy, couples therapy, teen therapy, and psychiatric services. According to Talkspace, it's as easy as taking a brief assessment, picking a provider, and then starting therapy. According to reviews left on the Google and Apple app store pages, it's not nearly as easy as that. Reports of long wait times to be matched with a therapist who fits your needs, unresponsive therapists, and even people being ghosted by their therapist seem to be common enough to raise concern. According to Talkspace, feeling better starts with a single message. Here's hoping the 2020 reports of those messages with your therapist being mined for your data have sorted themselves out. Yes, we found Talkspace does raise a number of privacy concerns.

What could happen if something goes wrong?

Talkspace comes with a fair amount of baggage when it comes to protecting and respecting their users' privacy and security. In 2020, the NY Times reported on allegations from former Talkspace employees about questionable marketing practices and questionable handling of private therapy chat logs. The founders of Talkspace disputed some of the claims made in the article. Consumer Reports reported in 2021 that Talkspace does collect data from Facebook for ads, although they said they only use information about a person from before they start therapy. There's also a 2019 article from Mashable detailing more questionable marketing practices, and CNBC reported in 2020 about privacy, transparency, and oversight concerns for therapy apps like Talkspace. All of this reporting leaves us concerned. And then there are Talkspace's own privacy policy, privacy notice, and additional privacy statements that leave us concerned. And sometimes scratching our heads in confusion too.

Talkspace says they can collect a lot of personal information on users, including name, email, address, phone number, gender, relationship status, employer, geolocation information, chat transcripts and more. While Talkspace says in their privacy notice they will not sell your medical information to others, we could find no promise in their privacy policy not to sell non-medical information (except for residents of California and those in Europe and the UK covered by GDPR privacy laws). A promise not to sell personal information is something we would like to see stated clearly. (Update: On June 14th, 2022, Talkspace updated their privacy policy to state, "Talkspace does not sell client information to third parties.")

They do say they can use your personal information for marketing, tailored advertising, and research purposes. And while your medical information is protected under HIPAA privacy laws -- which is good -- Talkspace also says "your written authorization will be required for uses and disclosures of psychotherapy notes and uses and disclosures of your protected health information for marketing." Which indicates that Talkspace could ask for your permission to use your health info and therapy notes for marketing purposes. Which feels like bad form to us. (Update: On June 14, 2022, Talkspace updated their privacy policy to remove any mention of "psychotherapy notes." They use the term "chat data" to refer to data you provide when you use the service. They say they can use this data "To conduct clinical and other academic research, internally and with approved research partners and identify summary trends or insights for use in external communications (where direct identifiers such as name and contact details have been removed, or pursuant to explicit patient authorization).")

We did reach out multiple times with our privacy and security questions to the email address Talkspace lists in their privacy policy for privacy-related questions. Unfortunately, they never responded. After we published our review of Talkspace, they did respond to our questions and confirm they take all the security steps necessary to meet our Minimum Security Standards, including a way for people to report security vulnerabilities. However, we did find a report from TechCrunch in 2020 where a security researcher found a bug in Talkspace, tried to report it to them, and in response Talkspace threatened to sue the researcher. Which feels like even more bad form to us.

So when Talkspace says in their privacy policy, "If you do not want us to share personal data or feel uncomfortable with the ways we use information in order to deliver our Services, please do not use the Services," we think that's pretty good advice.

Tips to protect yourself

  • Do not give an authorization to use or disclose your medical information. If you have given it already (or if you are unsure), revoke it by sending an email to [email protected]
  • Ask Talkspace to limit what they use or share with your insurance by writing to record [email protected]

Can it snoop on me?

Camera

Device: N/A

App: Yes

Microphone

Device: N/A

App: Yes

Tracks location

Device: N/A

App: Yes

What can be used to sign up?

What data does the company collect?

How does the company use this data?

UPDATE: On June 14th, 2022, Talkspace updated their privacy policy to state, "Talkspace does not sell client information to third parties."

Talkspace does not sell your medical information. Beyond that, Talkspace's privacy policy does not say whether they sell personal information, except for California residents and people in Europe and the UK covered by GDPR privacy laws. A promise not to sell personal information is something we would like to see stated clearly.

Talkspace does say they can use personal information for marketing, tailored advertising, and research purposes.

Update: On June 14th, 2022 Talkspace updated their privacy policy to remove any mention of psychotherapy notes. Their updated policy now refers to "chat data" instead. Talkspace says they can use your chat data for things such as "to build, modify, and develop new products, features, and Services" and "To carry out quality assurance and compliance activities." Their privacy policy states they can ask for permission to use your protected health information for research purposes with your written authorization. Talkspace says they can use data you provide when you use the service, including chat data, "To conduct clinical and other academic research, internally and with approved research partners and identify summary trends or insights for use in external communications (where direct identifiers such as name and contact details have been removed, or pursuant to explicit patient authorization)."

With your written authorization, Talkspace may use and disclose your psychotherapy notes or protected health information for marketing. (Note: this mention of potentially using psychotherapy notes for marketing purposes with permission was removed from Talkspace's privacy policy with the June 14, 2022 update.) If you give Talkspace an authorization to use or disclose your medical information, you may revoke that authorization in writing at any time by sending a revocation request to [email protected] If you revoke your authorization, they will no longer use or disclose medical information about you for the reasons covered by your written authorization, except to the extent that they have already acted in reliance on it. We found no information on how clear the authorization process is.

Talkspace collects data from your chats (transcripts), audio and video communications, and documents you share.

According to Talkspace’s general counsel, John Reilly, “Once a therapist/client relationship is established, no personally identifiable information is disclosed to third-party service providers about that user, unless the third party has signed a business associate agreement.”

Talkspace says they conform to HIPAA rules just as though you had sought help at a medical office.

How can you control your data?

You can request the following from Talkspace by sending an email to record [email protected]:
--See or get an electronic or paper copy of your medical record and other health information they have about you.
--Ask to correct your medical record. Talkspace may deny the request in specific circumstances.
--Request confidential communications (ask them to contact you in a certain way).
--Get a list of those with whom they've shared your information in the last 6 years.
--Ask Talkspace to limit what they use or share with others. Again, Talkspace may deny the request.

Talkspace "will retain your information in accordance with the appropriate statutory limitation periods as required by local law, in line with their legitimate business purposes for as long as your account is active or for as long as needed to provide you with the Services, as required in order to comply with Talkspace's legal obligations, a court order or to defend or pursue legal claims, in line with industry codes of practice, to resolve disputes and enforce their agreements." Wow, that is a long sentence! No more retention details are provided in the privacy notice.

In addition to your rights as a patient, you can ask Talkspace to stop sending marketing or promotional emails and mobile marketing communications from Talkspace, and limit the use of cookies, pixels, or web beacons.

What is the company’s known track record of protecting users’ data?

Needs Improvement

The New York Times reported in 2020 that, according to former employees and therapists at Talkspace, anonymized conversations between medical professionals and their clients were regularly reviewed by the company and mined for information. Two former employees told the Times that Talkspace data scientists mine client transcripts and share common phrases with the company's marketing team to better attract potential customers. Talkspace's founders disputed some of the Times' findings.

In 2020, TechCrunch reported that a security researcher tried to reach out to Talkspace to report a bug he found, and the company responded by threatening to sue him.

Child Privacy Information

In the United States, Talkspace may collect information from and may provide Services to minors ages 13–17 with the written authorization of a parent or guardian. Talkspace does not provide therapeutic services to minors outside of the US.

Can this product be used offline?


User-friendly privacy information?


Talkspace has two different privacy documents written in complicated language.

Links to privacy information

Does this product meet our Minimum Security Standards?




Strong password


Security updates


Manages vulnerabilities


Talkspace says any feedback regarding the security of the platform should be sent to [email protected]

In 2020, it was reported Talkspace threatened to sue a security researcher over a bug report.

Privacy policy


Does the product use AI?


Matching Algorithm. During onboarding we ask you to provide information so that we can assess your condition and incorporate your preferences. We then leverage a proprietary algorithm (and/or support from a Talkspace consultant) to match you to a provider.

Optimizing Diagnosis and Treatment. Throughout your experience, your provider uses the Talkspace Services to manage your diagnosis and treatment plan. The advanced machine learning features of our proprietary Services include natural language processing of communications with therapists. A core focus of our machine learning strategy is to provide the therapist with insights on patient needs and behaviours and offer techniques and suggestions that we believe are likely to maximize clinical outcomes.

Is this AI untrustworthy?

Can’t Determine

What kind of decisions does the AI make about you or for you?

Matching you with a healthcare provider.
Provide therapists with insights on patient needs and behaviours and offer techniques and suggestions.

Is the company transparent about how the AI works?


Does the user have control over the AI features?



At Talkspace, Start-Up Culture Collides With Mental Health Concerns
NY Times
The therapy-by-text company made burner phones available for fake reviews and doesn’t adequately respect client privacy, former employees say.
Talkspace Founders Respond to a New York Times Article.
An article about our company, Talkspace, was recently published in the New York Times that we want to address because it misconstrues our work and makes false and uninformed assertions about patient privacy and certain marketing practices.
Mental health apps draw wave of new users as experts call for more oversight
CNBC
Therapy app makers are rushing to meet a flood of new customers, but health experts remain divided on the regulatory path ahead as privacy and efficacy concerns mount.
Mental Health Apps Aren't All As Private As You May Think
Consumer Reports
Type “mental health” or a condition such as anxiety or depression into an app store search bar, and you can end up scrolling through endless screens of options. As a recent Consumer Reports investigation has found, these apps take widely varied approaches to helping people handle psychological challenges—and they are just as varied in how they handle the privacy of their users.
Talkspace threatened to sue a security researcher over a bug report
TechCrunch
A security researcher said he was forced to take down a blog post describing an apparent bug in Talkspace’s website that gave him a year’s subscription for free, after the company rejected his findings and sent the researcher a legal threat.
How a dead veteran became the face of a therapy app's Instagram ad
Mashable
Amanda Gumban was furious. The 17-year-old was scrolling through Instagram when she came across an unexpected sight. It had been four months since her sister, 21-year-old Airman First Class Lindsey Renee Gumban, had died by suicide. And yet, on this day in November, Amanda was confronted with her sister's face — staring back into hers in the form of an advertisement for a therapy app.
The Therapy-App Fantasy
The Cut
An overwhelming demand for counseling has spawned slickly marketed companies promising a service they cannot possibly provide.
The Spooky, Loosely Regulated World of Online Therapy
Starting treatment with Better Help, one of the most prominent “therapy-on-demand” apps to launch over the last few years, is easy, which is more or less the point. Like many of the businesses offering therapy online, the service promotes itself as a seamless way to access mental health services: “Message your therapist anytime, from anywhere.” In order to understand how Better Help handles its users’ data, we signed up for the service and monitored what kinds of information it was collecting and sending elsewhere.
Dramatic growth in mental-health apps has created a risky industry
The Economist
Customers’ “emotional data” can be hacked, and no one is checking if the apps work.


Got a comment? Let us hear it.