YouTube is the second most visited website in the world, and its recommendation algorithm drives the majority of video views on the platform. Previous Mozilla research determined that people are routinely recommended videos they don’t want to see, including violent content, hate speech, and political misinformation.
YouTube says that people can manage their recommendations and search results through the feedback tools the platform offers, but we heard from people that they do not feel in control of their experience with the YouTube algorithm. We surveyed 2,757 participants about how much control they feel they have over the platform and learned that many people believe their actions have no effect on YouTube’s recommendations.
To test whether these experiences are backed by data, we evaluated the effectiveness of these controls for real users of the platform. Using Mozilla’s research tool RegretsReporter, 22,722 people donated data about their interactions with YouTube. This study represents the largest experimental audit of YouTube by independent researchers, powered by crowdsourced data.
We looked at how people’s recommended videos changed over time after they used one of YouTube’s feedback tools, such as the “Dislike” and “Don’t Recommend Channel” buttons. From December 2021 to June 2022, RegretsReporter participants shared 567,880,195 video recommendations with us. In collaboration with researchers from the University of Exeter, we built a machine learning model to analyze the similarity between videos. This approach allowed us to study what effect YouTube’s tools have on the video recommendations of real users of the platform.
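Our actual similarity model is described in the methodology section of this report. As a rough illustration of the general idea only (not Mozilla’s implementation), a pairwise similarity score between a video a participant rejected and a video later recommended to them could be computed from video metadata, for example with TF-IDF text vectors and cosine similarity:

```python
# Illustrative sketch only: NOT Mozilla's actual model (which is described in
# the methodology section). It shows one simple way a pairwise video-similarity
# signal could be derived from video metadata such as titles.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical pair: a video a participant rejected and a later recommendation.
rejected_video = "How to get rich quick with cryptocurrency trading"
recommended_video = "Crypto investing strategies that made me a millionaire"

# Represent both texts in a shared TF-IDF space and compare them.
vectorizer = TfidfVectorizer(stop_words="english")
vectors = vectorizer.fit_transform([rejected_video, recommended_video])

# Scores near 1.0 suggest the later recommendation is similar to the video the
# participant rejected, i.e. a potential unwanted recommendation despite the
# user's feedback.
similarity = cosine_similarity(vectors[0], vectors[1])[0, 0]
print(f"similarity: {similarity:.2f}")
```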
In this report, we describe what we learned from our research using RegretsReporter data. Through complementary qualitative and quantitative studies, we determined that:
- People feel that using YouTube’s user controls does not change their recommendations at all. We learned that many people take a trial-and-error approach to controlling their recommendations, with limited success.
- YouTube’s user control mechanisms are inadequate for preventing unwanted recommendations. We determined that these controls do influence what is recommended, but the effect is negligible and most unwanted videos still slip through.
In the report, we provide examples of videos that were recommended after RegretsReporter participants used YouTube’s feedback tools. For example, one participant asked YouTube to stop recommending firearm videos, but was shortly afterward recommended more gun content. Another person asked YouTube to stop recommending cryptocurrency get-rich-quick videos, only to be recommended another crypto video.
In this report, we also provide a set of recommendations to both YouTube and policymakers. This guidance includes:
- YouTube’s user controls should be easy to understand and access. People should be provided with clear information about the steps they can take to influence their recommendations, and should be empowered to use those tools.
- YouTube should design its feedback tools in a way that puts people in the driver’s seat. Feedback tools should enable people to proactively shape their experience, with user feedback given more weight in determining what videos are recommended.
- YouTube should enhance its data access tools. YouTube should provide researchers with better tools that allow them to assess the signals that influence its recommendation algorithm.
- Policymakers should protect public interest researchers. Policymakers should pass or clarify laws that provide legal protections for public interest research.
This report also includes details about our research questions, methodology, and analysis for both our qualitative and quantitative studies.