This is a profile of the TikTok Observatory, a Mozilla Technology Fund awardee.


TikTok is often perceived as one of the rosier tech platforms, largely free from the misinformation and controversy that plague peers like Facebook. But in 2020, that reputation started to change: The Intercept published a leaked document detailing the platform’s moderation and algorithmic practices.

“TikTok said certain political topics should be censored,” recalls Marc Faddoul, a France-based researcher studying social media recommendation systems and algorithmic transparency. There were also more shocking policies in place, Faddoul says, “like demoting content by people who appear ‘ugly’ or ‘poor’.”

TikTok claimed the rules were outdated, “but there hasn’t been any systematic research to confirm whether these practices are still happening,” Faddoul cautions. “Despite being huge, TikTok is much less on the radar of the research community.”

But that’s starting to change — and Faddoul is helping make that change happen.


Faddoul is heading up the TikTok Observatory, a new research initiative and Mozilla Technology Fund awardee scrutinizing what trends, and what doesn’t, by way of TikTok’s algorithm and moderation policies.

The Observatory uses a mixture of crowdsourcing, scraping, and bot technologies to gather insight, instead of relying on TikTok’s API. “This allows us to run more controlled experiments, including with real users who can download a browser extension and share their data,” Faddoul says.
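
As a rough illustration of the bot-driven side of that approach, the sketch below opens a fresh, logged-out browser session and records which videos surface in the public feed. It is not the Observatory’s actual code: the feed URL, the CSS selector, and the use of Playwright are all assumptions made for the example.

```python
# Hypothetical sketch of a "sock-puppet" bot session that logs which videos
# TikTok serves to a brand-new, logged-out visitor. Selector and URL are
# assumptions and would need to match TikTok's current markup.
import json
import time
from playwright.sync_api import sync_playwright

FEED_URL = "https://www.tiktok.com/foryou"    # assumed public feed URL
VIDEO_LINK_SELECTOR = "a[href*='/video/']"    # hypothetical CSS selector

def collect_feed_sample(scrolls: int = 5) -> list[dict]:
    """Open a clean browser session and record the videos that appear."""
    observations = []
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        context = browser.new_context()        # no cookies = "new user"
        page = context.new_page()
        page.goto(FEED_URL, wait_until="domcontentloaded")
        for _ in range(scrolls):
            page.wait_for_timeout(2000)        # let the feed load
            for link in page.query_selector_all(VIDEO_LINK_SELECTOR):
                href = link.get_attribute("href")
                if href:
                    observations.append({"url": href, "seen_at": time.time()})
            page.mouse.wheel(0, 3000)          # scroll to trigger more items
        browser.close()
    return observations

if __name__ == "__main__":
    sample = collect_feed_sample()
    print(json.dumps(sample[:10], indent=2))
```

Repeated across many clean sessions, locations, and simulated watch histories, this kind of logging is what turns individual feeds into comparable data.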

All of the TikTok Observatory's tools are “free and open source software,” he adds. “This is a really complex and time-consuming thing to maintain, [so] we want to promote collaboration in the research community and empower other people to carry out these investigations.”

Right now, the Observatory’s focus is the demotion of political content in non-English speaking countries. For example, the Observatory is tracking the performance of content that’s sensitive to the Chinese Communist Party (CCP), especially content that appears abroad. “Douyin, the Chinese version of TikTok, is already heavily moderated — that’s a known fact,” Faddoul says. “The question is whether the CCP uses its influence on ByteDance [the owner of TikTok] to censor or shape political discourse abroad.”

As Faddoul and his colleagues make progress, platforms often change course to remain opaque. “We have already noticed that some of the moderation processes have evolved,” he says. Last year, Faddoul was investigating hashtags critical of Vladimir Putin in Russia. At first, these hashtags were blocked outright in TikTok’s search function. Now they are no longer blocked, but they may still be demoted.

“This does not mean that there’s no political influence happening on TikTok,” Faddoul explains. “It’s just harder to detect.”

[Screenshot: The TikTok Observatory, a new research initiative and Mozilla Technology Fund awardee.]


Indeed, the Observatory has determined there are multiple layers of moderation inside TikTok, from banning content outright, to preventing it from showing up in the influential For You feed. “Most of the influence can happen between these two levels of moderation,” Faddoul explains. “If something doesn’t show in the For You feed, it will never reach a large audience.”
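
One way to picture those layers is as a visibility spectrum. The tier names below are illustrative only, not TikTok’s own terminology:

```python
# Illustrative visibility tiers, ordered from least to most reach.
from enum import IntEnum

class VisibilityTier(IntEnum):
    BANNED = 0             # removed or blocked outright
    SEARCH_SUPPRESSED = 1  # still online, but hard to surface via search or hashtags
    FEED_DEMOTED = 2       # findable, yet rarely or never pushed to the For You feed
    FULLY_PROMOTED = 3     # eligible for full algorithmic amplification

def can_reach_mass_audience(tier: VisibilityTier) -> bool:
    """Only content the recommender actively surfaces can go viral."""
    return tier >= VisibilityTier.FULLY_PROMOTED
```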

For this reason, the Observatory is designing methodologies for determining what content appears on the For You page and what doesn’t. It’s hard work: “You can’t prompt a specific piece of content to appear,” Faddoul says. “But [if tested] at scale, you can determine if a certain type of content never shows up.”
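
To see why scale matters, here is a toy calculation with made-up numbers (not Observatory data): assume a topic should account for roughly 2% of feed slots, then ask how likely the observed count would be if the algorithm treated it neutrally.

```python
# Toy illustration of the "at scale" logic: a shortfall in how often a topic
# appears becomes measurable once enough feed slots have been observed.
# All numbers are invented for the example.
from math import comb

def prob_at_most(k: int, n: int, p: float) -> float:
    """Exact binomial tail: P(X <= k) when X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k + 1))

expected_share = 0.02         # assumed baseline share of feed slots
feed_slots_observed = 10_000  # For You items recorded across many bot sessions
topic_appearances = 120       # how often the topic actually showed up

p_value = prob_at_most(topic_appearances, feed_slots_observed, expected_share)
print(f"Observed rate: {topic_appearances / feed_slots_observed:.2%}")
print(f"P(seeing this few or fewer by chance): {p_value:.3g}")
# A vanishingly small probability is consistent with demotion rather than
# random variation, though it is not proof of deliberate censorship on its own.
```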

The TikTok Observatory is part of Tracking Exposed, a collective founded by Claudio Agosti to investigate social media platforms and seek greater algorithmic transparency. Tracking Exposed is also investigating Facebook, YouTube, Amazon, and PornHub.

Faddoul first learned about Tracking Exposed about two years ago, when he was working with Dr. Hany Farid at UC Berkeley to investigate YouTube’s algorithm. “We decided to join forces,” Faddoul says. “To expand the infrastructure, to find more research partnerships, and to build a social media observatory whose focus was algorithmic amplification and demotion.”

Faddoul believes the TikTok Observatory, Tracking Exposed, and similar projects have a crucial role to play in today’s digital world. While some people are optimistic about a policy solution, “that might take a couple years, and we don’t know what it looks like,” Faddoul says.

“I don’t think anything will ever replace the need for independent organizations to scrutinize how these platforms are behaving,” he adds.