Mozilla has built a browser add-on that transforms YouTube users into YouTube watchdogs


YouTube recommendations can be delightful, but they can also be dangerous. The platform has a history of recommending harmful content — from pandemic conspiracies to political disinformation — to its users, even if they’ve previously viewed harmless content.

Indeed, in October 2019, Mozilla published its own research on the topic, revealing that YouTube has recommended harmful videos, including misinformation. Amid a global pandemic and with an election looming in the U.S., the consequences of this problem are real and growing.

Despite the serious consequences, YouTube’s recommendation algorithm is entirely opaque to its users. What will YouTube recommend that users in the U.S. watch in the final days before the election? Or in the days that follow, when the results may not yet be clear?

So today, Mozilla is giving YouTube users a way to take action when they are recommended harmful videos: we are launching RegretsReporter, a browser extension that crowdsources research into YouTube’s recommendation problem.

By sharing data about YouTube regrets with Mozilla, people can help us better understand this issue — and help illuminate the right path forward.

Download the extension at mzl.la/regrets-reporter, and learn more about the project below.

Why we’re doing this

YouTube’s recommendation AI is one of the most powerful curators on the internet. YouTube is the second-most visited website in the world, and its AI-enabled recommendation engine drives 70% of total viewing time on the site. It’s no exaggeration to say that YouTube significantly shapes the public’s awareness and understanding of key issues across the globe.

For years, people have raised the alarm about YouTube recommending conspiracy theories, misinformation, and other harmful content. YouTube’s most consistent response has been to say that it is making progress and has reduced recommendations of harmful content by 70%. But there is no way to verify those claims, or to see where YouTube still has work to do.

With each day we wait for YouTube to substantiate these claims, videos like “Civil War Is Coming” or “Plandemic” could be pushed to more and more people. That’s why we’re recruiting YouTube users to become YouTube watchdogs. People can donate their own recommendation data to help us understand what YouTube is recommending, and to give us insight into how to make recommendation engines in general more trustworthy. Together, we can help uncover important information like:

  • What type of recommended videos lead to racist, violent, or conspiratorial content?
  • Are there patterns in terms of frequency or severity of harmful content?
  • Are there specific YouTube usage patterns that lead to harmful content being recommended?

With this information, Mozilla — along with fellow researchers, journalists, policymakers, and even engineers within YouTube — can work towards building more trustworthy systems for recommending content.

In open-source fashion, Mozilla will share findings from our research publicly, and we hope that YouTube and others will use this information to improve their products. Stay tuned to foundation.mozilla.org/blog for updates.


How it works

As you browse YouTube, the extension automatically sends data about how much time you spend on the platform, without collecting any information about what you are watching or searching for. This may give us some insight into how often users experience YouTube Regrets.

If you choose to, you can also send us a report. The report form will ask you to tell us more about your YouTube Regret, and it collects information about the video you are reporting and how you arrived at it. This is important for understanding how YouTube’s recommender system keeps people watching, and what effects it might have on them.
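
To make the distinction between “how much time” and “what you watch” concrete, here is a minimal sketch — not the actual RegretsReporter source — of how a WebExtension background script could count time spent on youtube.com without ever recording which videos are watched. It assumes the `browser.tabs` and `browser.storage` WebExtension APIs (e.g. via the webextension-polyfill typings).

```typescript
// Sketch: count time on youtube.com; URLs are inspected once and discarded.
let sessionStart: number | null = null; // when the current YouTube visit began
let totalMs = 0;                        // accumulated time on YouTube

function isYouTube(url?: string): boolean {
  if (!url) return false;
  const host = new URL(url).hostname;
  return host === "youtube.com" || host.endsWith(".youtube.com");
}

async function updateTimer(activeUrl?: string): Promise<void> {
  const now = Date.now();
  if (isYouTube(activeUrl)) {
    // Entering (or staying on) YouTube: start the clock if it isn't running.
    sessionStart = sessionStart ?? now;
  } else if (sessionStart !== null) {
    // Leaving YouTube: bank only the elapsed time, nothing else.
    totalMs += now - sessionStart;
    sessionStart = null;
    await browser.storage.local.set({ totalMs });
  }
}

// Watch the active tab only; private-browsing tabs are skipped entirely.
browser.tabs.onActivated.addListener(async ({ tabId }) => {
  const tab = await browser.tabs.get(tabId);
  if (!tab.incognito) await updateTimer(tab.url);
});
browser.tabs.onUpdated.addListener((_tabId, _changeInfo, tab) => {
  if (tab.active && !tab.incognito) void updateTimer(tab.url);
});
```

Note that the video title, URL, and search terms never leave the function that checks the hostname; only an elapsed-time counter is stored.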

Throughout this process, your privacy is paramount.

The data Mozilla collects is linked to a randomly generated user ID, not to your YouTube account. Only Mozilla will have access to the raw data, and when we share results from this research, we will do so in a way that minimizes the risk of users being identified. The extension gathers no data when you browse in a private browsing window. For more detailed information about your privacy, please read the RegretsReporter Privacy Notice.
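
To illustrate what a “randomly generated user ID” can look like in practice, here is a minimal sketch, again assuming the WebExtension storage API: one random identifier is created on first run and attached to subsequent reports, so data can be grouped per installation without any reference to a YouTube account. The `getInstallId` helper is hypothetical, not the extension’s actual code.

```typescript
// Sketch of a per-install pseudonymous ID (an assumed design). The ID is
// random, generated locally once, and never derived from the user's account.
async function getInstallId(): Promise<string> {
  const stored = await browser.storage.local.get("installId");
  if (typeof stored.installId === "string") return stored.installId;
  const fresh = crypto.randomUUID(); // random UUID, created on first run
  await browser.storage.local.set({ installId: fresh });
  return fresh;
}
```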

We ask users not to modify their YouTube behavior when using this extension. Don’t seek out regrettable content; instead, use YouTube as you normally would. That is the only way we can collectively understand whether YouTube’s problem with recommending regrettable content is improving, and where the platform still needs to do better.


Our past work on this topic

Mozilla is fueling the movement for a healthy internet and trustworthy AI. We do that through projects like this, which hold companies accountable when their AI harms people.

We’ve been pushing YouTube to address its recommendation problem for over a year. Below, find a timeline of Mozilla’s work on this issue.

July 2019

Mozilla awarded Guillaume Chaslot, a former YouTube engineer, a fellowship to support his work investigating AI systems, including the impact of YouTube’s recommendation engine.

August 7, 2019

Mozilla announced our intention to investigate YouTube’s recommendation problem and to begin a dialogue with the company about making improvements.

August 14, 2019

Mozilla published a reading list for our community detailing the ways recommendation AI — including YouTube’s — can manipulate consumers.

September 3, 2019

Mozilla hosted a Q&A with a teenager who was radicalized online, partially due to YouTube’s recommendation AI.

September 10, 2019

Mozilla launched its #YouTubeRegrets campaign, urging users to share stories about harmful recommendations they encounter.

October 14, 2019

After meeting with YouTube staff to discuss this issue, Mozilla published three concrete actions the platform could take to help address its recommendation problem. These changes entail making YouTube’s recommendation data more transparent.

October 15, 2019

Mozilla published findings from our #YouTubeRegrets research, highlighting how users were recommended anti-science content, anti-LGBT content, and other harmful videos.

October 26, 2019

At MozFest 2019, Mozilla hosted “The Algorithmic Gaze,” a panel exploring the dangers of untrustworthy AI like YouTube’s recommendation engine.

December 25, 2019

Mozilla published a running list of YouTube’s dismissals related to its recommendation problem.

September 2020

Mozilla launches the RegretsReporter browser extension to research YouTube’s recommendation problem.