Warning: *Privacy Not Included with this product
If you've been online lately, chances are you've seen or heard one of the many ads Talkspace runs all over the place -- in podcasts, on TV, on streaming services, on Facebook -- featuring celebrities such as Michael Phelps and Demi Lovato. Talkspace offers users access to online therapy, couples therapy, teen therapy, and psychiatric services. According to Talkspace, it's as easy as taking a brief assessment, picking a provider, and then starting therapy. According to reviews left on the Google and Apple app store pages, it's not nearly as easy as that. Reports of long wait times to be matched with a therapist who fits your needs, unresponsive therapists, and even therapists ghosting their clients seem common enough to raise concern. According to Talkspace, feeling better starts with a single message...here's hoping the 2020 reports of the company mining those messages with your therapist for data have been sorted out. Yes, we found Talkspace does raise a number of privacy concerns.
What could happen if something goes wrong?
First reviewed April 20, 2022. Review updated April 25, 2023
We're sad to say, Talkspace has not improved their privacy practices since we reviewed them in 2022. If anything, we have more concerns about them this year than last. At least last year, it looked like all users, regardless of what privacy laws they lived under, could have their data deleted. This year, Talkspace has changed their privacy policy so that it now only mentions deletion rights for users in California, the EU, the EEA, or the UK. This means Talkspace now earns all three of our privacy dings. That's not good.
We will give Talkspace credit for reaching out and communicating with us about their privacy practices. And we have seen them work over the past year to try and clarify things in their privacy policy that raised concerns. We appreciate that, for sure. However, Talkspace does collect a good deal of personal information, and can share some personal information for targeted advertising purposes (but not any health-related data after a user becomes a client, which is important and good). They also ask users straight away to take a questionnaire that gathers pretty sensitive information about things like a user's mental state, gender and gender identity, date of birth, and more. No privacy policy is presented before the answers to those questions are collected, so there's no way to understand up front how that information could be used. Indeed, Talkspace's Privacy Notice for California Users (which lays out more detailed privacy information than their general privacy policy) states that Talkspace uses "inferences about your interests derived from your responses to surveys completed prior to becoming a subscriber" for marketing purposes, including for tailored advertising. Ugh.
Talkspace also has some flags on their track record for protecting and respecting their users' privacy that concern us. In 2022, three US Senators sent a letter to Talkspace expressing concerns about their use of patients' personal health data and requesting more information about their data sharing and privacy practices. These Senators also sent a similar letter to BetterHelp (and it's good to note here that BetterHelp got in trouble with the US regulatory agency the FTC, whereas no such judgment has been issued against Talkspace).
Unfortunately, we have to say that Talkspace's privacy practices seem no better to us in 2023, and maybe even a bit worse. They do make it clear in their privacy policy that they take stronger measures to protect the privacy of users who pay up and become patients, but they still raise privacy concerns for us. Especially because they say they can collect things like gender identity and sexual orientation in the survey responses you provide before you become a subscriber, and then use and share that information for marketing purposes.
Read our review from 2022:
Talkspace comes with a fair amount of baggage when it comes to protecting and respecting their users' privacy and security. In 2020, the NY Times reported on allegations from former Talkspace employees about questionable marketing practices and questionable handling of private therapy chat logs. The founders of Talkspace disputed some of the claims made in the article. Consumer Reports reported in 2021 that Talkspace does collect data from Facebook for ads, although they said they only use information about a person from before they start therapy. There's also a 2019 article from Mashable detailing more questionable marketing practices, and CNBC reported in 2020 on privacy, transparency, and oversight concerns with therapy apps like Talkspace. All of this reporting leaves us concerned. And then there are Talkspace's own privacy policy, privacy notice, and additional privacy statements that leave us concerned. And sometimes scratching our heads in confusion too.
Talkspace says they can collect a lot of personal information on users, including name, email, address, phone number, gender, relationship status, employer, geolocation information, chat transcripts, and more. While Talkspace says in their privacy notice they will not sell your medical information to others, we could find no promise not to sell non-medical information in their privacy policy (except for residents of California and those in Europe and the UK living under GDPR privacy laws). This is something we like to see stated clearly. (Update: On June 14, 2022, Talkspace updated their privacy policy to state, "Talkspace does not sell client information to third parties.")
They do say they can use your personal information for marketing, tailored advertising, and research purposes. And while your medical information is protected under HIPAA privacy laws -- which is good -- Talkspace also says "your written authorization will be required for uses and disclosures of psychotherapy notes and uses and disclosures of your protected health information for marketing." Which indicates that Talkspace could ask for your permission to use your health info and therapy notes for marketing purposes. Which feels like bad form to us. (Update: On June 14, 2022, Talkspace updated their privacy policy to remove any mention of "psychotherapy notes." They now use the term "chat data" to refer to data you provide when you use the service. They say they can use this data "To conduct clinical and other academic research, internally and with approved research partners and identify summary trends or insights for use in external communications (where direct identifiers such as name and contact details have been removed, or pursuant to explicit patient authorization).")
We did reach out multiple times with our privacy and security related questions to the email address Talkspace lists in their privacy policy for privacy questions. Unfortunately, they did not respond before we published. After we published our review of Talkspace, they did respond to our questions and confirmed they take all the security steps necessary to meet our Minimum Security Standards, including having a way for people to report security vulnerabilities. However, we did find a report from TechCrunch in 2020 where a security researcher found a bug in Talkspace, tried to report it to them, and in response Talkspace threatened to sue the researcher. Which feels like even more bad form to us.
So when Talkspace says in their privacy policy, "If you do not want us to share personal data or feel uncomfortable with the ways we use information in order to deliver our Services, please do not use the Services," we think that's pretty good advice.
Tips to protect yourself
- Do not give authorization to use or disclose your medical information. If you have given it already (or if you are unsure), revoke it by sending an email to [email protected].
- Ask Talkspace to limit what they use or share with your insurance by writing to [email protected].
- Choose a strong password! You may want to use a password manager like 1Password, KeePass, etc.
- Use your device's privacy controls to limit the app's access to your personal information (do not give access to your camera, microphone, images, or location unless necessary).
- Keep your app regularly updated
- Limit ad tracking via your device (e.g., on iPhone, go to Settings -> Privacy & Security -> Tracking and turn off "Allow Apps to Request to Track") and via the biggest ad networks (for Google, go to your Google account and turn off ad personalization).
- Request your data be deleted once you stop using the app. Simply deleting an app from your device usually does not erase your personal data.
- When signing up, do not agree to the tracking of your data if you can avoid it.
Can it snoop on me?
Camera -- Device: N/A, App: Yes
Microphone -- Device: N/A, App: Yes
Tracks location -- Device: N/A, App: No
What can be used to sign up?
Email: Yes
Phone: No
Third-party account: No
What data does the company collect?
Personal
Name, address, date of birth, phone number, gender, email, relationship status, employer (sometimes), insurance information.
Body related
Audio/video, medical information (includes your medical history, diagnoses, treatments, current medical condition, and use of prescription medications).
Social
Information on friends you refer, on your partner (if you use couples therapy).
How does the company use this data?
How can you control your data?
What is the company’s known track record of protecting users’ data?
In 2023, a class action lawsuit filed March 1, 2023 in California federal court alleged that Talkspace Inc. deceives consumers into believing the virtual therapy company has enough therapists in its network to meet demand and that new patients will be promptly matched with a suitable therapist.
In 2022, three US Senators sent a letter to Talkspace expressing concerns about their use of patients' personal health data and requesting more information about their data sharing and privacy practices. These Senators also sent a similar letter to BetterHelp.
In 2020, the New York Times reported, based on accounts from former Talkspace employees and therapists, that anonymized conversations between medical professionals and their clients were regularly reviewed by the company so they could be mined for information. Two former employees told the Times that Talkspace data scientists mined client transcripts and shared common phrases with the company's marketing team to better attract potential customers. Talkspace's founders disputed some of the NY Times' findings.
In 2020, TechCrunch reported that a security researcher tried to reach out to Talkspace to report a bug he found, and the company responded by threatening to sue him.
Child Privacy Information
Can this product be used offline?
User-friendly privacy information?
Talkspace has multiple privacy documents written in fairly complicated language.
Links to privacy information
Does this product meet our Minimum Security Standards?
Encryption
Strong password
Security updates
Manages vulnerabilities
Talkspace says any feedback regarding the security of the platform should be sent to [email protected].
In 2020, it was reported that Talkspace threatened to sue a security researcher over a bug report.
Privacy policy
"Matching Algorithm. During onboarding we ask you to provide information so that we can assess your condition and incorporate your preferences. We then leverage a proprietary algorithm (and/or support from a Talkspace consultant) to match you to a provider.
Optimizing Diagnosis and Treatment. Throughout your experience, your provider uses the Talkspace Services to manage your diagnosis and treatment plan. The advanced machine learning features of our proprietary Services include natural language processing of communications with therapists. A core focus of our machine learning strategy is to provide the therapist with insights on patient needs and behaviours and offer techniques and suggestions that we believe are likely to maximize clinical outcomes."
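Talkspace doesn't publish how its matching algorithm or its natural language processing features actually work, so the quote above is all we have to go on. Purely as an illustration -- not Talkspace's real code, and every field and name below is a made-up assumption -- here is a minimal sketch of what matching intake answers to a provider could look like. It also shows why the sensitive questionnaire answers collected before you become a subscriber are exactly the data such a system depends on:

```python
# Hypothetical sketch only -- Talkspace's matching algorithm is proprietary and
# not public. This just illustrates the kind of logic a "match a user to a
# provider from intake answers" feature might use.
from dataclasses import dataclass, field


@dataclass
class Provider:
    name: str
    specialties: set = field(default_factory=set)   # e.g. {"anxiety", "couples"}
    accepts_insurance: bool = False


def match_score(intake: dict, provider: Provider) -> int:
    """Score a provider against intake answers; higher means a closer match."""
    score = len(set(intake.get("concerns", [])) & provider.specialties)
    if intake.get("needs_insurance") and provider.accepts_insurance:
        score += 1
    return score


def best_match(intake: dict, providers: list) -> Provider:
    """Pick the provider with the highest score for this user's answers."""
    return max(providers, key=lambda p: match_score(intake, p))


if __name__ == "__main__":
    # The intake dict stands in for the sensitive questionnaire answers.
    intake = {"concerns": ["anxiety", "relationships"], "needs_insurance": True}
    providers = [
        Provider("Provider A", {"anxiety", "depression"}, accepts_insurance=True),
        Provider("Provider B", {"relationships"}),
    ]
    print(best_match(intake, providers).name)  # -> Provider A
```

The takeaway isn't the specific scoring logic, which is invented here. It's that whatever the real algorithm looks like, your intake answers are its raw input -- which is why how those answers are stored, shared, and used for marketing matters so much.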
Is this AI untrustworthy?
What kind of decisions does the AI make about you or for you?
Is the company transparent about how the AI works?
Does the user have control over the AI features?
Dive Deeper
- Talkspace class action claims company does not have enough therapists to meet demand (Top Class Actions)
- Talk therapy apps face new questions about data collection from senators (The Verge)
- Warren, Booker, Wyden Call on Mental Health Apps to Provide Answers on Data Privacy and Sharing Practices that May Put Patients' Data at Risk of Exploitation (Elizabeth Warren)
- Letter from United States Senate to BetterHelp's Founder and President (United States Senate)
- Senators Question Mental Health App Providers About Privacy and Data Sharing Practices (The HIPAA Journal)
- Mental health apps have terrible privacy protections, report finds (The Verge)
- 'Creepy' Mental Health And Prayer Apps Are Sharing Your Personal Data (Forbes)
- Mental health and prayer apps have some of the worst privacy protections, study claims, finding they 'track, share and capitalize' on users' intimate thoughts and feelings (Daily Mail)
- BetterHelp vs Talkspace: Who's best for online therapy? (Inner Body)
- At Talkspace, Start-Up Culture Collides With Mental Health Concerns (NY Times)
- Talkspace Founders Respond to a New York Times Article (Medium)
- Mental health apps draw wave of new users as experts call for more oversight (CNBC)
- Mental Health Apps Aren't All As Private As You May Think (Consumer Reports)
- Talkspace threatened to sue a security researcher over a bug report (TechCrunch)
- How a dead veteran became the face of a therapy app's Instagram ad (Mashable)
- The Therapy-App Fantasy (The Cut)
- The Spooky, Loosely Regulated World of Online Therapy (Jezebel)
- Dramatic growth in mental-health apps has created a risky industry (The Economist)