
TikTok. It’s the app your parents aren’t on, the one that your government has maybe tried to ban, and the one that NYT internet culture reporter Taylor Lorenz allegedly spends an hour a night scrolling through.

Despite estimates placing TikTok sixth on the list of the world’s most-used social platforms (thanks in part to the app’s sophisticated and highly addictive AI), it’s received little scrutiny from the digital rights community relative to other apps of its size and popularity.

But that doesn’t mean that it’s lying low: If there’s one thing that TikTok has shown it knows how to do, it’s staying ahead of the trend. And in this case, that trend is the ever-growing debate among consumer groups, privacy activists, policymakers, and tech industry professionals over issues like algorithmic accountability and content moderation on social platforms.

Over the past year, TikTok has made major strides towards establishing a reputation for transparency and accountability. In June 2020, the company published a blog post detailing how it uses AI to deliver recommendations. The following month it announced that it would open up its algorithms and content moderation policies to the public through a new (virtual) Transparency and Accountability Center. The company also began publishing a quarterly Transparency Report that provides detailed information about community guidelines, fact-checking features, safety initiatives, and law enforcement requests for data. An analysis of LinkedIn profiles suggests that the company boasts 990+ employees working on public policy issues and government relations globally.

In just a few months – and against a backdrop of geopolitical issues threatening the app’s parent company ByteDance – TikTok has voluntarily disclosed information about its app that had taken years of campaigning (and some soft regulation) to get from companies like Facebook and Google.

It’s promising to see, but it still raises the question: Will TikTok move beyond the trend and work to establish genuine norms of transparency and accountability? As always, the devil is in the details. That’s why we’ve put together this summary of what we think the platform gets right – and where we still have questions.

The good (what seems promising)

At Mozilla, we believe that algorithmic accountability is critical to a healthy and open information ecosystem. We’ve pressured YouTube to open up its recommendation algorithm and demanded that social media platforms like Facebook release advertising archives to shed light on ad targeting and spend.

As platforms like Facebook and YouTube struggle to explain how their News Feed and recommendation algorithms work, TikTok seems to be moving towards greater transparency by saying it will open up its platform to researchers. This is the right move: until researchers have access to comprehensive data about these platforms’ algorithms, they cannot identify patterns of harm and abuse, and that opacity fuels public distrust.

TikTok also proactively joined the EU Code of Practice on Disinformation as well as the European tech industry association DOT Europe earlier this year (full disclosure: our subsidiary, Mozilla Corporation, is also a member of DOT Europe).

At this early stage, it’s hard to say what all of this means. On the one hand, it is encouraging to see these proactive signals. On the other, words need to be followed up with action. We’re encouraged by the constructive conversations we’ve already had with TikTok and hope to build on them to move towards greater accountability.

The bad (what we’re worried about)

Compared to the efforts of platforms like Facebook, Google, Twitter, and Snapchat, TikTok has provided virtually no transparency into advertising on its platform. The company says that it has banned political advertisements and that transparency requirements under the EU Code of Practice on Disinformation therefore do not apply. However, TikTok says that it permits ads from government agencies or nonprofits that are “not driven by partisan political motives,” though it does not explain how it determines whether an advertiser’s motive is partisan.

In addition, TikTok prohibits ads that contain “harmful misinformation, political content, or discriminatory content.” However, these categories are broad and TikTok does not offer details on how it defines them. Critically, TikTok does not currently offer any kind of public ad archive that would allow researchers to see what kinds of ads do run and how such rules are being applied in practice.

In the absence of any transparency into advertising on the platform, it is impossible to assess how TikTok is enforcing these policies. And despite TikTok’s ban on conspiracy theories surrounding the COVID-19 pandemic, an investigation from Media Matters, a US-based nonprofit, recently identified multiple videos advocating “baseless claims and debunked conspiracy theories” about the coronavirus, raising concerns about TikTok’s ability to enforce such broad policies at scale across its platform.

The unclear (things we don’t know)

Chief among TikTok’s commitments under the EU Code of Practice on Disinformation is “working with experts to understand challenges on their platform and improve their policies.” We are curious to learn more about how TikTok will do this on an ongoing basis, and how its efforts may differ from those of other industry actors in the social media space. Key questions we still have are:

  • How will TikTok support independent research?
  • Does the platform have Terms of Service exceptions for researchers?
  • TikTok has said that one of its goals is to allow experts to “audit the algorithm’s source code.” How does TikTok define audits? Will TikTok commit to independent audits, including civil rights audits?
  • In terms of TikTok’s Transparency Center, are there objective assessment criteria for researcher applications or other access requirements? What kinds of conditions are attached to research findings from the Center?
  • How will TikTok provide ongoing transparency into the enforcement of its content moderation and advertising policies?

These are just some of the questions we are left with as we consider how TikTok will approach independent research through its new Transparency and Accountability Center. We’re hopeful we’ll get more insight in the coming weeks and months.

Platform accountability is critical to a healthy and open information ecosystem, and it’s essential that platforms like TikTok get transparency right. We think TikTok can make great strides towards accountability by opening up its platform to independent researchers and audits, engaging civil society organizations and communities, and releasing data to the public about how it enforces its content moderation and advertising policies. But we won’t know how effective these efforts will be until we get more details.

Further reading

TikTok's Transparency and Accountability Center

TikTok's Transparency Report

TikTok’s commitments based on the EU Code of Practice (download)

Mozilla’s Election Tracker

