Many of our survey participants said they took a trial-and-error approach to curating their recommendations, with limited success. When people feel they need to log out, create new accounts, or use privacy tools just to manage their YouTube recommendations, it not only degrades the overall user experience but also erodes their confidence in their ability to make meaningful choices on the platform.

YouTube should respect the feedback users share about their experience, treating it as a meaningful signal about how people want to spend their time on the platform. Here are our recommendations for YouTube and policymakers:


1. YouTube’s user controls should be easy to understand and access.


People should be provided with detailed, accurate information about the steps they can take to influence their recommendations.

In our study, we learned that YouTube’s user controls have varying levels of effectiveness, but people don’t always understand how those controls influence the recommendations they see. YouTube already provides a general overview of how recommendations work, but it should offer more detailed information about which data points and signals influence recommendations, including what specific third-party data YouTube uses to inform a person’s recommendations.

Importantly, YouTube should also help people understand what effect each control will have on their recommendations within the product, not just in an overview page. The tools themselves should use plain language that tells people what will happen when they use the control (e.g., “Block future recommendations on this topic”) rather than describing the signal the control will send (“I don’t like this recommendation”).

YouTube should provide clear, in-product explanations of what each of these controls does. For instance, our study found that clicking “Don’t Recommend Channel” sends a stronger signal than “dislike,” but that neither was completely effective at preventing unwanted recommendations. To set accurate expectations, YouTube could explain:

Pressing ‘Don’t Recommend Channel’ will reliably reduce recommendations from this channel, but content from similar channels might still appear in your recommendations.

Under the terms of the EU Digital Services Act (DSA), enhanced user transparency will soon become a legal obligation for YouTube, requiring the platform to explain how the recommendation algorithm prioritizes and displays content, including information about the algorithm’s parameters.

But this level of transparency should be the bare minimum: General policy pages like “How YouTube Works” are useful to the general public but need to be accompanied by in-product tools and explanations that help people understand what kind of control they can exercise. These kinds of product changes would require rigorous user research and testing to ensure explanations are meaningful and useful to people, but would ultimately empower people to make more informed choices about their recommendations.


People should be directed to documentation and tools through more intuitive user pathways.

Simply Secure’s mapping of user controls found that managing recommendations on YouTube is a mess: settings that appear to offer control but don’t do much, convoluted and confusing pathways, and multiple pages and pop-ups to consult for guidance. In some cases, the user pathways are circular, leading people back to pages they’d previously visited without providing additional clarity. YouTube has since made some improvements, including a centralized hub where users can take action and manage their account settings and privacy.

YouTube should continue to assess the pathways users take to access controls and settings and redesign the user experience accordingly. These paths should be designed in a way that centers people’s lived experiences, aimed at enhancing their autonomy and control. People should also be able to provide more detailed feedback, beyond just clicking a button, about why they don’t want to see certain videos.


2. YouTube should design its feedback tools in a way that puts people in the driver’s seat.

As YouTube works towards giving users better information about each control, it should also ensure that those controls actually shape recommendations. We suggest YouTube overhaul its existing controls in favor of better feedback tools that enable people to make informed, active choices about their experience on the platform.


User controls should give people more control over what they see.

Many people we surveyed said they simply wanted to block certain kinds of channels and videos from future recommendations, but our study determined that the current controls do not effectively do this. Concerningly, our study demonstrated that even after using YouTube’s user controls, people continued to see videos that may violate the platform’s community guidelines, or videos potentially considered “borderline” (videos that don’t violate YouTube’s policies, but that a broad audience might not want to see). In our dataset of recommended videos, we observed that RegretsReporter users were recommended a number of videos that could fall under YouTube’s policies on firearms, graphic content, or hate speech. This suggests not only that YouTube is struggling to enforce its own content policies, but also that the platform is actively recommending this content to users even after they’ve sent negative feedback.

At a minimum, user controls should be reasonably effective at preventing recommendations of videos on a particular topic or by a particular channel. People should be able to exclude videos matching specific keywords, types of content, specific accounts, or other criteria from their recommendations. YouTube should explain how each of these categories of content is defined so that people can make informed choices.
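As a thought experiment, the sketch below shows what channel- and keyword-level exclusions could look like when applied to a set of candidate recommendations. The data structures and function names are invented for illustration and do not describe YouTube’s systems.

```python
# Purely illustrative sketch: applying user-defined exclusions to candidate
# recommendations. None of these structures reflect YouTube's actual systems.
from dataclasses import dataclass, field

@dataclass
class Video:
    video_id: str
    channel_id: str
    title: str

@dataclass
class UserExclusions:
    blocked_channels: set = field(default_factory=set)  # e.g. from "Don't Recommend Channel"
    blocked_keywords: set = field(default_factory=set)  # e.g. user-supplied topic keywords

def filter_candidates(candidates: list[Video], exclusions: UserExclusions) -> list[Video]:
    """Drop any candidate from a blocked channel or whose title matches a blocked keyword."""
    kept = []
    for video in candidates:
        if video.channel_id in exclusions.blocked_channels:
            continue
        title = video.title.lower()
        if any(keyword.lower() in title for keyword in exclusions.blocked_keywords):
            continue
        kept.append(video)
    return kept
```

In practice, keyword matching on titles would be far too coarse on its own, but the principle holds: exclusions should act as hard constraints on what is recommended, not as weak signals the ranking system may override.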


User feedback should be given more weight in determining how videos are recommended.

Over the course of doing this research, we learned that people’s intentions for how they want to spend time on the platform might not line up with their behavior. There are many reasons people watch videos they do not want to be recommended in the future: Participants described watching videos that were “guilty pleasures,” being drawn into “clickbait” videos, or accidentally clicking on a video with millions of views. People should have the ability to actively shape their recommendations in line with their interests and/or wellbeing.

YouTube should value and respect the feedback users send about their experience on the platform, treating it as a meaningful signal about how people want to spend their time on YouTube. For instance, YouTube could design a more collaborative recommendation system in which user feedback is treated as the most important signal, above watch time or engagement. YouTube should consider overhauling its existing user controls in favor of feedback tools that actually put people in the driver’s seat.
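To make “feedback above engagement” concrete, here is a deliberately simplified, hypothetical scoring rule in which an explicit user signal outweighs predicted engagement and a strong negative signal acts as a veto. The weights, feature names, and ranges are illustrative assumptions, not a description of YouTube’s ranking.

```python
# Hypothetical scoring sketch, not YouTube's ranking: explicit user feedback
# outranks engagement signals, and a hard negative acts as a veto.
def score_candidate(predicted_watch_time: float,   # normalized to [0, 1]
                    predicted_click_prob: float,    # normalized to [0, 1]
                    explicit_feedback: float) -> float:
    """explicit_feedback is a user-provided signal in [-1, 1]:
    -1 for "don't recommend this channel/topic", +1 for "show me more like this",
    and 0 when the user has said nothing."""
    if explicit_feedback <= -1:
        return float("-inf")  # treat a hard negative as a veto, not a down-weight
    engagement_score = 0.3 * predicted_watch_time + 0.2 * predicted_click_prob
    feedback_score = 2.0 * explicit_feedback  # weighted so feedback dominates engagement
    return engagement_score + feedback_score
```

The exact numbers are beside the point; the design choice is that explicit signals bound and dominate inferred engagement rather than being one weak input among many.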


YouTube’s feedback tools should enable people to more proactively design their YouTube experience.

Research into recommender systems has shown that people feel more satisfied when controls are paired with the ability to make meaningful choices about the recommendation algorithm. The platform should overhaul its current user controls and test alternative feedback tools that would allow people to more proactively design their YouTube experience. For instance, YouTube could design a feedback model that enables people to specify their interests and disinterests, including their preferences for subject matter, format, or diversity of recommendations.

Designing better feedback mechanisms could be a win-win for both users and YouTube: Rather than interacting with a paternalistic platform that makes choices on their behalf, people would have a greater say over what they see, and YouTube wouldn’t have to rely as heavily on passive data collection in order to infer what people want to watch (for instance, by using engagement data to approximate a user’s preferences). Simply put, YouTube could be “asking people what they want instead of just watching what they do.”

YouTube may soon be legally required to make some of these changes. Under the DSA, online platforms will need to take steps to ensure that their users can influence how information is presented to them. But our recommendation goes further: YouTube should reimagine how user feedback is given and interpreted on the platform. Feedback tools should empower people to take an active role in designing and curating their YouTube experience, rather than relying on reactive tools like the “dislike” button and hoping they send the right signal.


3. YouTube should enhance its data access tools.

Mozilla built RegretsReporter as a crowdsourced research platform because it is very difficult to access the kind of data needed to study YouTube’s algorithm. To run this study, we relied primarily on crowdsourced data collected from RegretsReporter users as well as automated web requests to acquire video metadata from YouTube. The reality is that YouTube’s API has its limits: Critical classes of data are not made available, and rate limits prevent large-scale data collection. What’s more, YouTube does not provide other kinds of audit tools for researchers to assess harm on the platform.
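For readers unfamiliar with what “acquiring video metadata” involves, the sketch below shows the kind of request researchers can make against the public YouTube Data API v3 videos.list endpoint. The API key and video IDs are placeholders, and the pacing is a simplification of how quota and rate limits constrain collection at scale; this is not the pipeline used in our study.

```python
# Illustrative sketch (not the RegretsReporter pipeline): fetching public video
# metadata through the YouTube Data API v3 "videos.list" endpoint. The API key
# and video IDs are placeholders; quota allowances are set per project by YouTube.
import time
import requests

API_KEY = "YOUR_API_KEY"                      # placeholder
VIDEO_IDS = ["VIDEO_ID_1", "VIDEO_ID_2"]      # IDs collected elsewhere (e.g. crowdsourced)

def fetch_metadata(video_ids):
    """Fetch snippet and statistics metadata, batching up to 50 IDs per request."""
    url = "https://www.googleapis.com/youtube/v3/videos"
    items = []
    for i in range(0, len(video_ids), 50):    # the API accepts at most 50 IDs per call
        params = {
            "part": "snippet,statistics",
            "id": ",".join(video_ids[i:i + 50]),
            "key": API_KEY,
        }
        resp = requests.get(url, params=params, timeout=30)
        resp.raise_for_status()
        items.extend(resp.json().get("items", []))
        time.sleep(1)                          # pace requests to stay within quota limits
    return items
```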


YouTube should give researchers access to platform data so that they can adequately scrutinize the platform.

YouTube should ensure that the tools and systems it builds enable large-scale analysis of platform content. At a minimum, YouTube should improve its public-facing API so that independent researchers, journalists, and the general public can have greater access to platform data, in a way that’s privacy-protecting.

Researchers need to be able to access the content hosted on YouTube in order to adequately diagnose problems on the platform. Vetted researchers should be given access to platform data that goes beyond what is offered in the public API, such as data about public content, users, or pages on the platform, in a way that’s in line with data protection rules and principles. The European Digital Media Observatory (EDMO) has proposed a draft code of conduct on how platforms can give researchers access to data while protecting user privacy, in compliance with the General Data Protection Regulation (GDPR).

Mandating researcher access to platform data was one key outcome of the EU Digital Services Act (DSA), legislation which was adopted by the European Parliament in July 2022. Under the DSA’s article on “Data Access and Scrutiny”, vetted independent researchers must be given the tools they need to investigate and run experiments on recommender systems as they relate to platform harms or “systemic risks”. While the details of who will get access and what data will be made available are still being worked out, platforms will need to expand and improve their data access tools.

On the heels of the DSA’s adoption, YouTube announced its release of a new researcher-facing API. The researcher API seems to provide the exact same data that was already available through its public API, but with increased rate limits on a case-by-case basis. In the context of our study, access to this API would have saved a lot of time and trouble, since we had to rely on alternative tools for acquiring video metadata due to the API’s current rate limit.

However, the researcher API would not have helped with much else in this study. The data we analyzed — what videos are being watched, what is being recommended, and how people interact with the platform — is not data that would be made available with this new researcher API. In addition, under the terms of the researcher API program, only researchers at accredited and vetted academic institutions will be eligible to access the API. This means that journalists and independent researchers like Mozilla’s would not be able to use it.

As YouTube works towards enhancing its data access tools, it’s important that it ensures both the researcher-facing and public APIs are designed to help civil society actors assess the risks and harms on the platform.


YouTube should build tools that enable researchers to carry out audits and experiments on the platform.

In addition to ensuring researcher access to data, YouTube should build tools that enable researchers to audit and run tests on the platform. Independent audits and experiments are essential to holding platforms accountable. Such efforts may be voluntary on YouTube’s part, but policymakers should consider mandating the release of such tools.

YouTube should provide tools that simulate user pathways or recommendation pathways. One of the challenges researchers face when studying YouTube is that personalization is very difficult to study without access to real user data. Simulation tools would allow researchers to tweak inputs and parameters in order to run experiments and tests on YouTube’s recommendation algorithm. In the context of this study, simulation tools could have helped us test how different user controls impact what videos the recommendation algorithm serves, looking at a range of different variables.
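To illustrate the kind of experiment such tooling could support, here is a purely hypothetical sketch of how a researcher might script a test against a simulation interface. No such interface exists today; every name, method, and parameter below is invented.

```python
# Hypothetical sketch of a researcher-facing simulation interface. YouTube offers
# no such API today; all names and parameters are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class SimulatedSession:
    watch_history: list[str]                                 # video IDs the simulated user "watched"
    controls_used: list[str] = field(default_factory=list)   # e.g. ["dont_recommend_channel"]

def reappearance_rate(simulator, base_history: list[str], control: str,
                      blocked_channel: str, trials: int = 100) -> float:
    """Estimate how often a blocked channel reappears after a given control is used.

    `simulator` stands in for a hypothetical sandbox that returns recommendations
    for a simulated session; researchers could sweep over controls and histories.
    """
    reappearances = 0
    for _ in range(trials):
        session = SimulatedSession(watch_history=list(base_history),
                                   controls_used=[control])
        recommendations = simulator.recommend(session, n=20)  # hypothetical call
        if any(video.channel_id == blocked_channel for video in recommendations):
            reappearances += 1
    return reappearances / trials
```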


4. Policymakers should protect public interest researchers.

Mozilla researchers relied on a crowdsourced approach to studying YouTube for this study, working with tens of thousands of volunteers who offered up their data to support the research. Without access to data from real people about what they are experiencing, it’s impossible to analyze the spread and impact of algorithmic harms on the platform. As previously discussed, we recommend YouTube provide greater access to platform data for researchers – whether such moves are voluntary or mandated by regulators, under legislation like the DSA.

However, there are no clear legal protections in place for independent researchers conducting good-faith, public interest research about tech platforms. Research projects like New York University’s Ad Observatory, which carries out critical research into ad targeting on Facebook, have had their access to Facebook data revoked, even though their crowdsourced approach does not violate user privacy.

Policymakers should act to pass laws that provide legal protections for public interest research. Such protections should be written into platforms’ terms of service agreements and encoded in law. They should protect independent researchers, civil society organizations, and journalists conducting research, as long as the research complies with data privacy laws and research ethics standards.

These protections should complement data access tools by covering research that doesn’t rely solely on platform-provided access. As we’ve demonstrated with this study, there are research questions that the data from YouTube’s API doesn’t currently answer. If we had waited for YouTube to release the data we need, we would never have been able to carry out this audit. As problems rapidly emerge and scale on platforms, it’s critical that researchers have legal protections to carry out research, as long as it complies with privacy principles.

These laws or research exceptions could stipulate that platforms cannot actively block research tools or impose rate limits designed to prevent researchers from carrying out their work. Such protections should apply to researchers who are collecting crowdsourced data from platform users, scraping data from platforms, or conducting sock puppet audits – methods that in many cases violate platforms’ terms of service.

In any case, clarity around the legality of public interest research is urgently needed. The current lack of legal protections stifles legitimate public interest research: Researchers worry that platforms may take legal action against them, or they avoid altogether research projects that would violate platforms’ terms of service. Researchers who lack institutional support, or who cannot otherwise assume the legal risk, are especially likely to be deterred. This uncertainty further entrenches platform power and chills critical public interest research.


Conclusion

In our research, we determined that YouTube’s user controls do not work for many people. Our analysis of RegretsReporter data revealed that these user controls don’t seem to have a major effect on how videos are recommended. We heard from several people who expressed frustration with the user controls, and who said they wanted better tools that simply work the way they’d expect them to. Importantly, the tools themselves are largely reactive and they don’t empower people to actively design how videos are recommended.

YouTube should make major changes to how people can shape and control their recommendations on the platform. YouTube should respect the feedback users share about their experience, treating it as a meaningful signal about how people want to spend their time on the platform. YouTube should overhaul its ineffective user controls and replace them with a system in which people’s satisfaction and well-being are treated as the most important signals.
