When we last studied TikTok, it left us worried. Sure, the social media app is known for meme videos and dance challenges, but there were underlying problems. For one, political ads. TikTok banned them, yet some TikTok influencers, being paid by politicians, were allowed to spread political messaging. (Luckily our urging led TikTok to release additional methods for experts to study the platform.) Another problem was sponsored content — not the fact that it appeared on TikTok but the fact that it wouldn’t be labeled as such. More recently, new research points to another issue worth putting under a microscope: the types of videos TikTok chooses to suggest to teenagers.
Alongside memes and dances, TikTok’s algorithm plays a huge role in the app’s dominance. By now you know how it goes. You open the app, create an account and then hit like on a bunch of dog videos and basketball clips. “Wow,” says TikTok’s algorithm, “This girl really likes dog videos and basketball clips.” So it suggests more. This leads to TikTok suggesting more videos you’d like. Maybe you even share a few with a couple friends. Suddenly, an hour’s gone by — you don’t know where the time went and TikTok smiles maniacally in the background.
Algorithmic recommendations aren’t always bad, but sometimes they can be. For example, research shows Instagram’s suggestion algorithms have a toxic effect on teen girls — the source of the research is Facebook itself. In the case of TikTok, many worry about how easy it is to surface depression and suicide content. Not just once, but over and over again.
Salvatore Romano is one of those worried people. Salvatore is the head of research at AI Forensics, a two-time Mozilla Technology Fund awardee. We’ve profiled the research group before and talked about the sorts of things the lab studies, like what it would take to actually change your TikTok suggestions. With the help of Amnesty International and the Algorithmic Transparency Institute, Salvatore and crew have released research revealing just how bad TikTok’s algorithm can be for depressed teens — maybe even for depressed adults too.
What videos does TikTok recommend to teenage users?
Users can sign up for their own TikTok accounts as young as 13 years old. Salvatore’s research shows exactly what teens are exposed to when they first sign up to use the service. “We used different techniques to experiment with what types of content minors that have access to the platform see, from sock puppet accounts to manually interacting with TikTok to simulate the 13-year-old experience,” says Salvatore. “We found that a 13-year-old TikTok user creating a new account quickly sees problematic content related to depression, mental health and health problems in general.”
Salvatore’s research found that the shift from starting a new account to having that account’s feed filled with depression videos happens at lightspeed. “We have recorded examples where an account sees anxiety and depression content after only 67 seconds of interaction with the platform,” says Salvatore. “After a few minutes, depending on the interaction, most of the content suggested related to anxiety, depression and even suicide.”
Salvatore notes that during the experiment, the team did not actively seek out depression content in the search bar nor hit “follow” on depressive accounts. Rather, they interacted with the videos TikTok was already suggesting. There were even regional differences amongst the three countries the group studied. “We saw that in the Philippines, entering a depressive content rabbit hole happened much faster than in the U.S. or Kenya,” says Salvatore.
TikTok’s depression content has teens worried too
What do teens think? The ones Salvatore spoke with are worried. “In our focus groups, the kids we spoke with realize a large part of their feed, especially when they’re feeling sad and depressed, gets filled with depressive content that can lead to a rabbit hole of suicidal thoughts,” says Salvatore. “They’re concerned with the lack of urgency around how they interact with the platform and what content they are being suggested.”
TikTok is taking action, says Salvatore, but it may not be enough. “I worry that social media apps, like TikTok and beyond, continue to pursue profit and engagement before safeguarding the people that use these platforms,” says Salvatore. “We’re asking that parents and schools familiarize themselves with the platform so that kids aren’t completely on their own here. We’re also asking that TikTok reconsider its guiding principle. We want TikTok to acknowledge that looking only for profit and engagement will have devastating consequences on the younger generation. We want TikTok to put human rights before profit.”
Want to help Mozilla study TikTok’s algorithm? Look out for our upcoming TikTok Reporter tool, which will let you donate your data and help us learn more about the social media app. Coming soon!
TikTok Goes From Funny Memes To Depressing Teens At An Alarming Rate
Written By: Xavier Harding
Edited By: Audrey Hingle, Kevin Zawacki, Tracy Kariuki, Xavier Harding
Art By: Shannon Zepeda