Ginger bills itself as an emotional support app. Offered mostly through employers, organizations, and health plans, the app lets users access text chat with coaches, video therapy and psychiatry sessions, and a library of self-care resources. So whether you're feeling anxious about that big work meeting, the scary state of things in the world, depression, or more, Ginger says it can help, "all from the privacy of your smartphone." Which, hmm, doesn't make us feel so good about things, as we all know our smartphones are not all that private. Ginger does cover some of the data they collect under HIPAA...which is good. But know that not all the personal information they collect on you is covered by HIPAA. And they do say they can disclose some of your data to the provider (i.e., your employer) or health plan you sign up through, so just be aware of that. Is Ginger bad for privacy? Well, it's hard to tell exactly, but you could potentially be sharing a lot of personal information in those chat-based coaching sessions that might not be covered by HIPAA, so do be careful.
What could happen if something goes wrong?
At first glance, the Ginger app seems alright privacy-wise. According to their many privacy policies, they don't sell your data, they don't share your data widely with third parties for advertising purposes, they practice good cybersecurity hygiene, and they say they'll delete your personal information upon request. All this looks pretty good to us. Good work, Ginger.
We do want to call out one thing about Ginger -- and many other mental health apps -- that raises a concern for us. And it's those text-based chats you have with "coaches." Many people expect their personal conversations through online therapy sites to be private, or to be covered by stricter privacy laws protecting health care data, like HIPAA in the United States. But conversations with unlicensed coaches are often not required to be covered by these stricter privacy laws. What does that mean? Pretty much, it means: beware of what you share in text-based chats with online mental health apps unless you have been 100% guaranteed those chats are covered by strict health privacy laws like HIPAA. Otherwise, they could be used for things like improving the app, advertising or marketing, or turned into "anonymized" data used for many purposes.
Yes, we have overarching concerns about online chat transcripts with all mental health apps. But what is actually going on with Ginger? Much of what happens inside the Ginger app is text-based coaching that’s available 24/7. A promotional video describes those chats as providing “guidance through tough emotional challenges.” Another part of the website suggests your coach can help you through some pretty tough stuff, like if you’re struggling with depression.
But what you should know, before your thumbs set that tiny digital keypad ablaze, is that all of those DMs back and forth become "Care" or "Coaching Data" that's stored by Ginger. And though the services are marketed together with Headspace as "Mental healthcare," coaching isn't therapy. The coaches aren't required to be licensed therapists, which also means those conversations aren't necessarily covered by stricter health privacy laws.
For example, according to Ginger's FAQ, multiple coaches can be assigned to help you at different times, depending on availability and other factors, forming a “care team.” And all those coaches can talk to each other about what you’ve been talking to them about, “so they’ll know what you’ve been working on with another coach and where you left off.”
Ginger also mentions that “[their] unique platform analyzes chat transcripts and other data points to help coaches provide effective support for each member in their care.” And in their International Coaching Privacy Statement they say they can use that Coaching Data to “evaluate the quality and progress of our coaching program, and optimize [their] coaching services.”
So between the paper trail, the inter-coach conferring, the technological analysis, and the somewhat ambiguous description of evaluating how you're progressing and optimizing the services, those transcripts are being shared and used in lots of ways... which means they aren't exactly what you'd call "private." Plus, it's possible that they're part of the personal information that Ginger is allowed to share in anonymized and aggregate form. In the US version of the Privacy Policy, they mention that could include your (de-identified) health information too.
Given that Ginger is offered through employers, and that access to licensed therapists is available but at a higher cost and sometimes only if your employer sponsors it, it seems most Ginger users will rely on these text-based "emotional coach" conversations. So remember, those conversations are likely not nearly as private as the video sessions you can sometimes get through Ginger with a licensed therapist. As one Ginger user pointed out, "When a patient has no way of knowing who at a healthcare practice knows the details of their mental health concerns, it means there are an undisclosed collection of people wandering around the world with knowledge of and access to that patient’s most private struggles, habits, and thoughts. This also means confidentiality — and any subsequent breach thereof — is nearly impossible to track."
What's the worst that could happen with Ginger? Well, the idea that your "emotional coaching" chat transcripts exist is enough to give us the privacy heebie jeebies. Knowing they could be shared around internally at Ginger and then having to trust an employer-sponsored app to keep everything private, well, we're glad Ginger has a decent privacy policy. We're still wary of such a model for mental health care and suggest you think this through before signing up and sharing that you really hate your boss with that emotional coach you're chatting with at 2am.
Tips to protect yourself
- Do not give the app access to your photos, videos, or camera
- Do not log in using third-party accounts
- Do not connect to any third party via the app, or at least make sure the third party employs decent privacy practices
- Do not give consent for sharing of personal data for marketing and advertising
- Choose a strong password! You may use a password manager like 1Password, KeePass, etc.
- Do not use social media plug-ins
- Use your device's privacy controls to limit access to your personal information via the app (do not give access to your camera, microphone, images, or location unless necessary)
- Keep the app regularly updated
- Limit ad tracking via your device (e.g., on iPhone, go to Privacy -> Advertising -> Limit ad tracking) and the biggest ad networks (for Google, go to your Google account and turn off ad personalization)
- Request that your data be deleted once you stop using the app. Simply deleting an app from your device usually does not erase your personal data
- When signing up, do not agree to tracking of your data if possible
Can it snoop on me?
Camera
Device: N/A
App: Yes
Microphone
Device: N/A
App: Yes
Tracks location
Device: N/A
App: No
What can be used to sign up?
Email: Yes
Phone: No
Third-party account: Yes
For employer-sponsored benefits, signing up requires either a unique access code, which is sent to members directly, or a combination of first name, last name, date of birth, ZIP code, and work email address.
What data does the company collect?
Personal
Name, email address, mailing address, phone number, payment card information, and any other Personal Information you voluntarily submit through the online registration form
Body related
Social
How does the company use this data?
How can you control your data?
What is the company’s known track record of protecting users’ data?
No known privacy or security incidents discovered in the last 3 years.
Child Privacy Information
Can this product be used offline?
User-friendly privacy information?
Links to privacy information
Does this product meet our Minimum Security Standards?
Encryption
Strong password
Security updates
Manages vulnerabilities
Ginger has an active bug bounty program in place through HackerOne, where vulnerabilities can be reported. Researchers can also submit security vulnerabilities directly to [email protected].
Privacy policy
https://www.ginger.com/privacy-policy
AI assists users by directing them to content relevant to their issues, as well as by matching them to the caregivers most likely to give them the best outcome for their concerns and interests.
Is this AI untrustworthy?
What kind of decisions does the AI make about you or for you?
Is the company transparent about how the AI works?
Does the user have control over the AI features?
Dive Deeper
- When healthcare companies like Ginger.io share our information with countless members of the company, what happens to our privacy? (Medium)
- Lyra vs Modern Health vs Ginger: What’s the Best Mental Health Platform for Employees? (Fin vs Fin)