This is a closing profile of Mozilla Technology Fund awardee TikTok Observatory and their work over the past year.


From Bella Poarch’s “M to the B” lip-sync to Zach King’s Harry Potter illusion to Khaby Lame’s “how to open a car door” video, TikTok has cemented its place as an unrivaled platform for entertainment.

But TikTok isn’t just a happy-go-lucky platform. In 2020, The Intercept exposed TikTok for its “invisible censoring” of content, also known as shadow banning. Content by people deemed to be “poor,” “ugly,” or harming the “national honor” of China was being suppressed, failing to appear in the platform’s “For You” feed.

Algorithm watchdog and Mozilla Technology Fund awardee Tracking Exposed investigates influential and opaque recommender systems like TikTok’s. In the past, they have researched Pornhub, YouTube, and Facebook. And when Tracking Exposed first decided to research TikTok via the TikTok Observatory, their aim was to look into this shadow banning, especially with regard to political content.

“The TikTok Observatory developed the infrastructure needed to investigate recommended and demoted content on the ‘For You’ feed — that was the original idea,” explains Marc Faddoul, Co-Director of Tracking Exposed.

Why TikTok? Other than it being the new kid on the block, Faddoul says TikTok is under-researched, and that the platform works to keep it that way. On platforms like YouTube and Facebook, for example, an Application Programming Interface (API) allows researchers to mine platform activity data, albeit in a limited way. TikTok does not allow this.
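For a sense of what that researcher access looks like elsewhere, here is a minimal sketch using YouTube’s official Data API v3. The endpoint and parameters are real, but the API key and video ID are placeholders, and TikTok exposes no comparable interface.

```python
# Sketch: fetching public video metadata through YouTube's official Data API v3.
# TikTok offers no comparable researcher-facing API, so this kind of query is
# not possible there. API_KEY and the video ID below are placeholders.
import requests

API_KEY = "YOUR_API_KEY"  # placeholder; obtain one from the Google Cloud console

response = requests.get(
    "https://www.googleapis.com/youtube/v3/videos",
    params={
        "part": "snippet,statistics",  # title, channel, view/like counts, etc.
        "id": "dQw4w9WgXcQ",           # any public video ID
        "key": API_KEY,
    },
    timeout=10,
)
response.raise_for_status()

for item in response.json().get("items", []):
    print(item["snippet"]["title"], item["statistics"].get("viewCount"))
```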

The TikTok Observatory developed the infrastructure needed to investigate recommended and demoted content on the ‘For You’ feed.

Marc Faddoul, TikTok Observatory

And so researching shadow banning was not simple. TikTok’s recommender system, a key part of the investigation, is highly personalized and dependent on location, time, and other variables — which made finding patterns difficult. Faddoul says they needed to design a clever way to control the experiment using many different accounts.
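The general idea behind such a controlled setup can be sketched roughly: run many accounts under fixed conditions, record each one’s “For You” feed, and compare which videos recur or never surface. The sketch below is purely illustrative, with a simulated collect_feed() standing in for the real data-collection layer; it is not Tracking Exposed’s actual tooling.

```python
# Hypothetical sketch (not Tracking Exposed's actual code): compare the
# "For You" feeds of many controlled accounts to spot systematic patterns.
# collect_feed() is a stand-in for the real collection layer; here it just
# simulates feeds so the comparison logic can run end to end.
import random
from collections import Counter

CATALOG = [f"video_{i}" for i in range(200)]  # fake pool of video IDs

def collect_feed(account_id: str, n_items: int = 50) -> list[str]:
    """Placeholder: return the video IDs shown to one controlled account."""
    rng = random.Random(account_id)  # deterministic per account for the demo
    return rng.sample(CATALOG, n_items)

def compare_accounts(account_ids: list[str]) -> Counter:
    """Count how many controlled accounts were shown each video."""
    seen = Counter()
    for account in account_ids:
        seen.update(set(collect_feed(account)))
    return seen

if __name__ == "__main__":
    accounts = [f"controlled_account_{i}" for i in range(20)]
    counts = compare_accounts(accounts)
    # Videos surfacing for nearly every account look "recommended"; videos that
    # never appear despite matching the accounts' interests are demotion candidates.
    print(counts.most_common(5))
```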

Although shadow banning related to Chinese politics was what they intended to investigate, the war between Russia and Ukraine presented an opportunity to look into what content was being recommended to audiences in those countries, too.

What they found was more surprising than they expected — and resulted in three reports.

“The first report was basically exposing the fact that TikTok had blocked international content for all its users in Russia, which was quite a dramatic move at the time,” Faddoul says. “TikTok — especially at the beginning of the war — was one of the few places where there was vocal dissent against the Kremlin in Russia, and where information was still flowing rather freely.”

According to the report, “TikTok had announced that it would not allow new content uploads in response to the Russian law that made spreading ‘fake news’ about the Russian army a crime punishable by 15 years in prison, but TikTok did not announce that it has banned all foreign accounts for users in Russia.”

“It was only after our investigation, as journalists started talking to them, that TikTok acknowledged that they had done this,” Faddoul says.

Even though TikTok stated that no new content would be allowed on the platform in Russia, there was a loophole. The second report looked into what kind of content was getting through.

“It turned out that before the ban, the majority of the content was against the war and supporting the Ukrainian side in Russia,” Faddoul says. “However, after the ban, only pro-Kremlin content would go through. And so we saw how this ban not only limited the ability of dissidents to voice their concerns or their perspective on TikTok, but also enabled propaganda to continue.”

The third and final report was perhaps the most surprising, addressing a whole new phenomenon: content that appeared to be blocked when you visited a creator’s page was still being promoted on the “For You” feed.

“This was something unprecedented which we called ‘shadow promotion’ as opposed to shadow banning,” Faddoul explains. “Here was the opposite. You had content that appeared to be banned on the platform if you accessed it directly, but in fact it was still being promoted by the recommendation system.”

According to the report, shadow promotion was a new finding that was not occurring at the time of the first and second investigations.

“This latest report into shadow promotion calls for increased scrutiny and transparency for TikTok, especially over its role in Russia in light of the ongoing war in Ukraine,” the report reads. “It also explains how TikTok’s transparency efforts fall short of the international standards set by the EU Digital Services Act and the upcoming EU AI Act.”

Given the findings from these three reports, it’s clearer than ever that TikTok needs to open up its platform to researchers.

“TikTok has shown consistently that they are not being transparent about their policies,” Faddoul says. “This justifies once again the need for external scrutiny, but we wish they were also more cooperative and outspoken.”

Faddoul is hoping legislation will make things more transparent.

“There is no way to investigate the app quantitatively or to analyze the recommender system,” he explains. “The Digital Services Act (DSA) might force the platform to put an API in place, but the details of the data that will be provided and the conditions for access are not clear.”

What’s next? In 2023, Tracking Exposed will be working on making its technology more robust so that other investigators and organizations can use it.

“We believe that the most efficient way to increase public scrutiny of TikTok is to develop and distribute open-access technology, enabling a diverse range of perspectives to be represented in the research,” Faddoul says.

The team also plans to host workshops to train people to use the technology, particularly in underrepresented communities. They will also create an interface that shows real-time trending content across the globe.