YouTube is the second most visited website in the world, and its recommendation algorithm drives the majority of video views on the platform, more than channel subscriptions or search. The problems with YouTube’s video recommendation algorithm have been well documented, with claims that the platform promotes incremental radicalization (“rabbit holes”)[1], political extremism[2], and ideological bias.[3]

In its communications, the platform claims that it optimizes its recommendation system for metrics like “user satisfaction” and “time well spent” – not just watch time. YouTube CEO Susan Wojcicki has emphasized this point at public events. However, evidence suggests that the platform continues to prioritize engagement over people’s well-being. A 2019 paper written by Google engineers[4] describes how the algorithm is designed to balance user engagement with user satisfaction: a tradeoff between promoting content that keeps people on the site longer and content the algorithm predicts they will like.
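To make that tradeoff concrete, the sketch below shows a toy multi-objective ranking step in Python. It is an illustration only, not YouTube’s actual system: the candidate fields, the weights, and the blended_score helper are hypothetical stand-ins for the learned engagement and satisfaction objectives described in the paper.

    # Illustrative sketch only: a simplified weighted blend of a predicted
    # engagement signal and a predicted satisfaction signal. The field names
    # and weights are hypothetical, not YouTube's production ranking model;
    # a real system would normalize and calibrate these signals.
    from dataclasses import dataclass

    @dataclass
    class CandidateVideo:
        video_id: str
        predicted_watch_minutes: float  # engagement objective (expected watch time)
        predicted_satisfaction: float   # satisfaction objective (e.g. probability of a "like")

    def rank_candidates(candidates, engagement_weight=0.7, satisfaction_weight=0.3):
        """Order candidates by a weighted blend of the two objectives.

        Raising engagement_weight favors videos that keep people watching longer;
        raising satisfaction_weight favors videos the model predicts people will like.
        """
        def blended_score(video: CandidateVideo) -> float:
            return (engagement_weight * video.predicted_watch_minutes
                    + satisfaction_weight * video.predicted_satisfaction)

        return sorted(candidates, key=blended_score, reverse=True)

    # With engagement weighted heavily, a long video the viewer may not enjoy
    # can outrank a shorter one they probably would.
    candidates = [
        CandidateVideo("long_provocative_video", predicted_watch_minutes=12.0, predicted_satisfaction=0.2),
        CandidateVideo("short_enjoyable_video", predicted_watch_minutes=4.0, predicted_satisfaction=0.9),
    ]
    for video in rank_candidates(candidates):
        print(video.video_id)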

YouTube says that people can control their recommendations and search results through the feedback tools the platform offers. However, many of the stories surfaced through Mozilla’s 2019 YouTube Regrets campaign suggest that people continue to see unwanted videos even after following the steps YouTube prescribes to avoid them. In our own 2021 investigation into YouTube’s recommender system, we heard from people that they do not feel in control of their experience on YouTube, nor do they have clear information about how to change their recommendations.

Mozilla’s vision for Trustworthy AI, including recommendation algorithms, is that people have meaningful control over these technologies. When a platform has poorly designed controls, or controls that do not work at all, people feel disempowered and helpless. In collaboration with Mozilla, the organization Simply Secure mapped and analyzed YouTube’s controls in 2021 to understand whether the platform’s design supports the user experience principles of control, freedom, and transparency. Their analysis of those controls, “Dark Patterns in User Controls: Exploring YouTube’s Recommendation Settings,” determined that YouTube’s user controls do not appear to be designed with people’s well-being in mind. They concluded that:

  • YouTube’s existing user controls are reactive, not proactive, leaving people to catch up to the recommendation engine rather than shaping what they want to see.
  • Options to “teach” the YouTube algorithm are few and limited in scope.

Ultimately, the controls on YouTube are reactive tools: They don’t empower people to actively shape their experience on the platform.

To evaluate how effective YouTube’s controls are for the people who use the platform, we carried out a study that leverages Mozilla’s large community of RegretsReporter volunteers. In total, 22,722 people donated their data to Mozilla, generating a dataset of the 567,880,195 videos they were recommended. This study represents the largest experimental audit of YouTube by independent researchers, powered by crowdsourced data.

To understand whether people feel in control, Mozilla also surveyed 2,757 RegretsReporter participants about their experiences with YouTube’s recommendation algorithm.

Ultimately, we wanted to learn whether people feel in control on YouTube – and whether those feelings are borne out by our RegretsReporter data. By combining quantitative and qualitative insights in this research project, we aim to paint a more complete picture of how YouTube’s recommendation algorithm handles user feedback.


Footnotes

  1. Zeynep Tufekci, “YouTube’s Recommendation Algorithm Has a Dark Side,” Scientific American (April 1, 2019), https://www.scientificamerican.com/article/youtubes-recommendation-algorithm-has-a-dark-side/.
  2. Becca Lewis, “Alternative Influence,” Data & Society (Data & Society Research Institute, September 18, 2018), https://datasociety.net/library/alternative-influence/.
  3. Muhammad Haroon et al., “YouTube, The Great Radicalizer? Auditing and Mitigating Ideological Biases in YouTube Recommendations” (arXiv, March 24, 2022), https://doi.org/10.48550/arXiv.2203.10666.
  4. Zhe Zhao et al., “Recommending What Video to Watch Next: A Multitask Ranking System,” in Proceedings of the 13th ACM Conference on Recommender Systems, RecSys ’19 (New York, NY, USA: Association for Computing Machinery, 2019), 43–51, https://doi.org/10.1145/3298689.3346997.