Headspace says its mission is to improve the health and happiness of the world. Founded by a former monk who also seems to have a love for the circus, Headspace offers guided meditation and mindfulness tips as well as music from John Legend to help you fall asleep. This popular app -- the company claims over 70 million members in 190 countries around the world -- says it wants to be "your mind's best friend," which sure does sound nice. It seems your mind's best friend also might like to collect and share your data with places like Facebook and Google, though, so maybe hold off on that BFF label for now.
What could happen if something goes wrong?
Headspace does collect some personal data on users -- name, email address, phone number, device information, app usage data, and Facebook ID -- although seemingly not as much as some of the other mental health apps we reviewed. Still, the company collects your personal and app usage information and says it can share that information with third parties like Google and Facebook to target advertising and personalize content within the app. Unfortunately, this is pretty standard practice for apps like this in our current data economy. Still, it's not great if you care more deeply about your privacy.
Does it matter if Facebook knows when you use a meditation app, if Google knows where you use the app, or if Headspace knows you're searching for a meditation to help you prepare for a big exam? It could. What's the worst that could happen? Well, here's a blog post from Headspace describing how the company is developing machine learning applications to use your data in real time to offer "recommendations that engage our users with new relevant, personalized content that builds consistent habits in their lifelong journey." "New, consistent habits" could also be interpreted as keeping users on the app as much as possible. One idea mentioned in the post is to use biometric data such as step count or heart rate to recommend, in real time, more content to get users moving or exercising. OK, that's not necessarily a bad thing. But what more could a company potentially learn about you and do with that data? Perhaps learn your emotional state? Your anxiety level? And then target more content or advertising at you when you're at your most vulnerable? We don't think this is currently happening with Headspace. But the title of this section is "What could happen if something goes wrong?" And it seems that, now or in the future, there are many things that could go wrong with this sort of vast data collection and processing, and the personalization and ad targeting that follow.
All in all, Headspace isn't the worst meditation app we reviewed. It does collect a good amount of data, though, shares some of that data with third parties for things such as targeted advertising, and seems to be looking to use more of that data to keep you on the app as much as possible. Like we said, this is unfortunately the norm for apps in our current data economy. The question is: should it be?
What can be used to sign up?
Facebook, Apple, Spotify, and Google accounts can be used to sign up.
What data does the company collect?
Name, email address, and in certain instances, telephone number.
Facebook profile information, such as name, email address, and Facebook ID, if you choose to log in to the products through Facebook.
How does the company use this data?
How can you control your data?
What is the company’s known track record of protecting users’ data?
No known privacy or security incidents discovered in the last 3 years.
Child Privacy Information
Can this product be used offline?
You can download content to use it offline.
User-friendly privacy information?
Links to privacy information
Does this product meet our Minimum Security Standards?
Headspace offers personalized, real-time suggestions for guided meditation.
Is this AI untrustworthy?
What kind of decisions does the AI make about you or for you?
Is the company transparent about how the AI works?
Does the user have control over the AI features?
Got a comment? Let us hear it.