Ginger bills itself as an emotional support app. Offered mostly through employers, organizations, and health plans, this app lets users access text chat with coaches, video therapy and psychiatry sessions, and a library of self-care resources. So, whether you're feeling anxious about that big work meeting, the scary state of the world, depression, or something more, Ginger says it can offer help "all from the privacy of your smartphone." Which, hmm, we're not sure that makes us feel so good, since we all know our smartphones are not all that private. Ginger does have some HIPAA compliance with the data they collect...which is good. But know that not all the personal information they collect on you is covered by HIPAA. And they do say they can disclose some of your data to the provider (i.e. employer) or health plan you sign up through, so just be aware of that. Is Ginger bad for privacy? Well, it's hard to tell exactly, but you could potentially be sharing a lot of personal information in those chat-based coaching sessions that might not be covered by HIPAA, so do be careful.
What could happen if something goes wrong?
At first glance, the Ginger app seems alright privacy-wise. According to their many privacy policies, they don’t sell your data, they don't share your data widely with third parties for advertising purposes, they practice good cybersecurity hygiene, and they say they’ll delete your personal information upon request. All this looks pretty good to us. Good work Ginger.
We do want to call out one thing about Ginger -- and many other mental health apps -- that raises a concern for us. And it's those text-based chats you have with "coaches." Many people expect their personal conversations through online therapy sites to be private, or to be covered by stricter privacy laws protecting health care data, like HIPAA in the United States. But conversations with unlicensed coaches are often not required to be covered by these stricter privacy laws. What does that mean? Pretty much, it means beware what you share in text-based chat communications with online mental health apps unless you have been 100% guaranteed they are covered by strict health privacy laws like HIPAA. Otherwise, they could be used for things like improving the app, advertising or marketing, or turned into "anonymized" data to be used for many purposes.
Yes, we have overarching concerns about online chat transcripts with all mental health apps. But what is actually going on with Ginger? Much of what happens inside the Ginger app is text-based coaching that’s available 24/7. A promotional video describes those chats as providing “guidance through tough emotional challenges.” Another part of the website suggests your coach can help you through some pretty tough stuff, like if you’re struggling with depression.
But what you should know, before your thumbs set that tiny digital keypad ablaze, is that all of those DMs back and forth become “Care” or “Coaching Data” that’s stored by Ginger. And though the services are marketed together with Headspace as “Mental healthcare,” coaching isn’t therapy. That’s because the coaches aren’t required to be licensed therapists, but it also means that those conversations aren’t necessarily covered by stricter health privacy laws.
For example, according to Ginger's FAQ, multiple coaches can be assigned to help you at different times, depending on availability and other factors, forming a “care team.” And all those coaches can talk to each other about what you’ve been talking to them about, “so they’ll know what you’ve been working on with another coach and where you left off.”
Ginger also mentions that “[their] unique platform analyzes chat transcripts and other data points to help coaches provide effective support for each member in their care.” And in their International Coaching Privacy Statement they say they can use that Coaching Data to “evaluate the quality and progress of our coaching program, and optimize [their] coaching services.”
So between the paper trail, the inter-coach-conferring, the technological analysis, and the somewhat ambiguous description of evaluating how you’re progressing and optimizing the services, those transcripts are being shared and used in lots of ways... which means they aren't exactly what most of us would call "private." Plus, it’s possible that they're part of the personal information that Ginger is allowed to share in an anonymized and aggregate form. In the US version of the Privacy Policy, they mention that that could include your (de-identified) health information too.
Given that Ginger is offered through employers, and that access to licensed therapists comes at a higher cost and sometimes only if your employer sponsors that feature, it seems most Ginger users will rely on these text-based "emotional coach" conversations. So remember, those conversations are likely not nearly as private as the video sessions with a licensed therapist you can sometimes get through Ginger. As one Ginger user pointed out, "When a patient has no way of knowing who at a healthcare practice knows the details of their mental health concerns, it means there are an undisclosed collection of people wandering around the world with knowledge of and access to that patient’s most private struggles, habits, and thoughts. This also means confidentiality — and any subsequent breach thereof — is nearly impossible to track."
What's the worst that could happen with Ginger? Well, the idea that your "emotional coaching" chat transcripts exist is enough to give us the privacy heebie jeebies. Knowing they could be shared around internally at Ginger and then having to trust an employer-sponsored app to keep everything private, well, we're glad Ginger has a decent privacy policy. We're still wary of such a model for mental health care and suggest you think this through before signing up and sharing that you really hate your boss with that emotional coach you're chatting with at 2am.
Tips to protect yourself
- Do not give access to your photos, videos, or camera
- Do not log in using third-party accounts
- Do not connect to any third party via the app, or at least make sure that a third party employs decent privacy practices
- Do not give consent to the sharing of your personal data for marketing and advertising
- Choose a strong password! You can use a password manager like 1Password, KeePass, etc. (or see the sketch after this list for how a strong random password can be generated)
- Do not use social media plug-ins.
- Use your device's privacy controls to limit access to your personal information via the app (do not give access to your camera, microphone, images, or location unless necessary)
- Keep your app regularly updated
- Limit ad tracking via your device (e.g., on iPhone, go to Privacy -> Advertising -> Limit Ad Tracking) and via the biggest ad networks (for Google, go to your Google account and turn off ad personalization)
- Request your data be deleted once you stop using the app. Simply deleting an app from your device usually does not erase your personal data.
- When signing up, do not agree to tracking of your data if you can avoid it.
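For the "choose a strong password" tip above, here is a minimal sketch of what a password manager does for you behind the scenes. It is an illustration only, not anything Ginger provides: a short Python script using the standard library's `secrets` module (a cryptographically secure source of randomness) that runs entirely on your own device.

```python
# Minimal sketch: generate a strong random password locally.
# Uses only Python's standard-library "secrets" module, so nothing
# here ever leaves your device.
import secrets
import string


def random_password(length: int = 20) -> str:
    """Return a random password built from letters, digits, and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))


if __name__ == "__main__":
    print(random_password())
```

A dedicated password manager is still the better choice in practice, since it also stores each password for you and only fills it in on the matching site or app.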
Can it snoop on me?
Camera
Device: Not applicable
App: Yes
Microphone
Device: Not applicable
App: Yes
Tracks location
Device: Not applicable
App: No
What can be used to sign up?
Email: Yes
Phone: No
Third-party account: Yes
For employer-sponsored benefits, signing up for an account requires either a unique access code that is sent directly to members, or a combination of first name, last name, date of birth, ZIP code, and work email address.
What data does the company collect?
Personal
Name, email address, mailing address, phone number, payment card information, and any other Personal Information you voluntarily submit through the online registration form
Body related
Social
How does the company use this data?
How can you control your data?
What is the company's known track record of protecting users' data?
No known privacy or security incidents discovered in the last 3 years.
Child privacy information
Can this product be used offline?
User-friendly privacy information?
Links to privacy information
Does this product meet our Minimum Security Standards?
Encryption
Strong password
Security updates
Manages vulnerabilities
Ginger has an active bug bounty program through HackerOne, where vulnerabilities can be reported. Researchers can also submit security vulnerabilities directly to [email protected]
Privacy policy
https://www.ginger.com/privacy-policy
AI assists users by directing them to content relevant to their issues and by matching them with the caregivers most likely to give them the best outcome for their concerns and interests.
Is this AI untrustworthy?
What kind of decisions does the AI make about you or for you?
Is the company transparent about how the AI works?
Does the user have control over the AI features?
Dive Deeper
- When healthcare companies like Ginger.io share our information with countless members of the company, what happens to our privacy? (Medium)
- Lyra vs Modern Health vs Ginger: What’s the Best Mental Health Platform for Employees? (Fin vs Fin)
Comments
Got a comment? Let us know.