An illustrative image of a mobile phone with a megaphone


The paper lays out a comprehensive set of fixes to rein in “opaque and harmful” recommendation systems used by Facebook, Twitter, YouTube, and TikTok.


(BERLIN, GERMANY | DECEMBER 7, 2022) -- Mozilla is calling on platforms like YouTube, Facebook, TikTok, and Twitter to publicly disclose how their recommender systems work and to grant users meaningful control over how their feeds are customized, as part of a comprehensive strategy unveiled today to make recommendation systems more responsible.

Despite growing evidence that platforms’ recommender systems can harm individuals and societies, the debate has focused on content moderation ‘fixes’ rather than on one of the root causes: content amplification and the recommendation engines behind it.

The paper, titled “Towards Responsible Recommending,” comes at a time when regulators on multiple continents are wrestling with how to confront the challenges of AI systems. It calls for greater oversight, scrutiny, and user control, and details how to make these a reality. The paper focuses on recommender systems that are operated by the largest tech platforms, like Facebook, YouTube, Twitter, and TikTok, and that amplify user-generated content.

The paper also draws on Mozilla’s own research, including the YouTube Regrets 2021 and 2022 studies, which demonstrated that YouTube recommended harmful content that even violated its own policies. One of the studies also showed that YouTube’s recommendation controls, such as ‘dislike’ and ‘do not recommend,’ were ineffective in enabling users to avoid unwanted recommendations.

Says Maximilian Gahntz, Mozilla’s Senior Policy Researcher: “Recommender systems are at the heart of many online services and influence millions of people each day. Yet they remain opaque and removed from public scrutiny: We know too little about how they work and impact our lives. Meanwhile, the way content spreads through these systems can and does lead to genuine harm across the globe, from hate speech to disinformation.”

Gahntz adds: “Mozilla’s recommendations are not meant to be seen as standalone quick fixes to this problem. Rather, they are complementary strategies that can begin to rein in opaque and harmful recommender systems.”

The paper calls for:

[1] Layered oversight and scrutiny:

  • Disclose publicly how recommender systems work and how they are operated.
  • Provide information on policies guiding content demotion (or reduction).
  • Report and share aggregate data on demoted content.
  • Provide data and tools to analyze the best-performing and most engaging content.
  • Provide access to data and documentation to qualified independent researchers.
  • Enable research by civil society organizations, journalists and researchers.
  • Conduct or commission audits of recommender systems and publish the results.

[2] Informed and empowered users:

  • Enable users to customize their recommendation feeds through effective controls.
  • Give users meaningful control over how their data is used to generate recommendations.
  • Allow users to opt out of personalized recommendations.
  • Provide users with easily accessible information explaining why specific content is displayed.

Press contacts:

Europe: Tracy Kariuki, [email protected]
North America: Helena Dea Bala, [email protected]