When you use a work-issued computer, you’re consenting to a certain level of monitoring by your employer. Using that laptop, or even just connecting to your company’s internet, can allow your workplace to collect data on how often you’re typing, which sites you visit, and potentially more. The emergence of employer-provided mental health apps renews this worry: many employees may not realize that companies can receive data not just from a work computer but also through the mental health apps their job provides.

Several mental health apps provide de-identified, aggregated reports to the companies that use their services, typically through a dashboard or hub. The information often includes registration rates, engagement rates, how often services are used, and which well-being topics most interest employees.

“De-identified and aggregated data reporting on service utilization is commonplace within the industry,” noted a spokesperson with Modern Health.

This is true, says Daniel Aranki, an assistant professor at the UC Berkeley School of Information, but it’s not necessarily something we should be comfortable with. Even when the app maker has good intentions and an employer uses the data it receives ethically, the opportunity for misuse remains.

“Imagine one scenario, where if I’m an employer, I could use this data somehow as an indicator of productivity and divide bonuses between departments based on this data,” says Aranki. “That would be unfair.”

Employers, he adds, could also treat employees in different departments worse, without realizing it, based on what they know about each department’s collective mental health.

Companies that provide mental health apps benefit from the fact that many workers want better mental health care. According to the American Psychological Association, demand for mental health treatment is increasing. Tara Sklar, director of the Health Law & Policy Program at the University of Arizona’s College of Law, describes a difficult balance between providing something viewed as necessary and managing the risks that come with it.

“These mental health apps offer a needed service, which explains their high uptake and growth,” says Sklar. “What is less understood by many — including employers and employees — are the privacy implications.”

Those privacy implications can vary. Sklar points to one: “The likelihood that these types of applications could become increasingly invasive in collecting and repurposing this sensitive health information unbeknownst to the user that could have implications for their employment, for insurance policies, banking loans, and used in other ways that could discriminate or disadvantage the user, particularly as re-identification of data becomes increasingly common.”

The Insights Hub section of Modern Health’s app is intended to provide “employer-customers with insights and resources to support the mental health needs of their workforce,” says Modern Health’s spokesperson. According to Modern Health, the company works with a third-party data expert firm to mitigate the risk of tying user data to a person. Modern Health says its app provides no raw data to these experts.

But Modern Health isn’t the only game in town. Other mental health apps, like Calm, offer similar reporting. When a company purchases Calm for Business, the enterprise version of the app, its HR department gets access to a dashboard of aggregated, non-identifiable information: how many employees have signed up, the most popular content among employees, and the most popular times to use Calm. “The Portal does not permit employers to view the specific activity of any individual employee,” says a Calm spokesperson.

Headspace for Work, the enterprise offering of the mental health app Headspace, also provides an administrative dashboard. There, employers can see registration information: first name, last name, date of registration, and date of last use of the product. The idea is that this helps employers see who has redeemed a subscription and reallocate unused ones, while still protecting user privacy, explains Alex Boisvert, chief technology officer of Headspace Health. “No part of the employer dashboard provides information into individual usage statistics about how a user engages with the app or the activities a user performs in the app,” says Boisvert. “Instead, employers have access to aggregated and anonymized usage statistics to provide insight into general service usage trends, activities, and preferences.”

Talkspace for Business, meanwhile, provides employers with de-identified and aggregated “broad usage metrics,” says a Talkspace spokesperson. “[This] helps HR departments, specifically, know how many people are utilizing their service and if needed, encourage participation,” says the company.

Aranki notes that it’s reasonable for employers to receive some information on whether the app they’re paying for is being used. “Some sort of dashboard maybe is warranted — just a quick overview of how many people are using it, not what they’re doing — so employers can legitimately say it’s worth providing and paying for this,” says Aranki.

But do you really want your employer to encourage you to use a mental health app, or discuss your mental health with you at all? There’s a tension between wanting employer-provided mental health services and the reality that most mental health apps will report at least some information back to the employer. After all, companies are buying these services because of demand.

“There is a high and growing demand for these types of applications and related digital technologies to support mindfulness and wellness,” says Sklar. “These attractive, easy-to-use platforms are likely going to be increasingly offered by employers to their employees to support them in an efficient, desirable way, which could also mitigate against burnout and increase productivity. However, employees should be aware that their use is being monitored by these applications.”

Aranki echoes the sentiment. “Some companies use these perks [mental health apps] as an incentive for good employees to apply to their company,” says Aranki. “That is the main reason they are providing them, to my understanding. It’s not the fact that they want to snoop on their employees. But if you provide the opportunity to give them this data, then they will likely do something with it. That’s just the nature of things.”

Paying for your own subscription to a mental health app, rather than using one provided by your employer, may help you avoid these particular privacy worries. You can also talk with your HR department and ask exactly how it plans to use the aggregate data it receives.

“That is one of the things that consumers can do to influence change,” Aranki says. “Unfortunately, when it comes to privacy, we have little control over the practices of companies. It’s usually ‘here’s what I do and I impose it on you — if you like it, use my service; if you don’t then use someone else’s.’”

Beyond the question of whether your employer has any connection to data about your mental health, there’s the likelihood that the app itself is giving your user data to third parties, who can in turn use it to target you with ads. This transmission of data is not always disclosed in the app’s privacy policy. To limit how much information you might be passing along, you can take a few steps. The Federal Trade Commission, for example, recommends paying close attention to the permissions an app requests, limiting location permissions, and avoiding signing into apps through a social network account. On an iPhone or Android phone, you can also go to your privacy settings and opt out of app tracking (though how well this works is debatable).

Learn more in our *Privacy Not Included mental health apps guide

