YouTube

Since 2019, Mozilla has researched YouTube's recommendation system and advocated for greater transparency. Our original research, investigations, and campaigns have demonstrated some of the systemic problems with YouTube's recommendation algorithm, and we have suggested solutions.


Current Work

RegretsReporter

The RegretsReporter browser extension, built by the nonprofit Mozilla, helps you take control of your YouTube recommendations.

See the extension

YouTube User Control Study Findings

Powered by data donated by 22,722 people, Mozilla scrutinized YouTube to determine how much control people actually have over the platform’s recommendation algorithm. This is what we learned.

Read the findings

RegretsReporter Dataset

The RegretsReporter dataset can be used for research and investigations into the impact of YouTube's recommendation engine on communities around the world.

Learn more

Download the dataset


Past Work

YouTube Regrets Findings

Mozilla and 37,380 YouTube users conducted a study to better understand harmful YouTube recommendations. This is what we learned.

Read the findings

YouTube Regrets Stories

Got YouTube Regrets? Thousands do! These are the stories Mozilla collected when we asked people about the videos they wish they’d never clicked that sent them down a rabbit hole…

Read more


Impact

July 2019 - Mozilla awarded Guillaume Chaslot, a former YouTube engineer, a fellowship to support his work investigating AI systems, including the impacts of YouTube’s recommendation engine.

August 7, 2019 - Mozilla announced our intention to investigate YouTube’s recommendation problem and to begin a dialogue with the company about making improvements.

August 14, 2019 - Mozilla published a reading list for our community detailing the ways recommendation AI — including YouTube’s — can manipulate consumers.

September 3, 2019 - Mozilla hosted a Q&A with a teenager who was radicalized online, partially due to YouTube’s recommendation AI.

September 10, 2019 - Mozilla launched its #YouTubeRegrets campaign, urging users to share stories about harmful recommendations they encounter.

October 14, 2019 - After meeting with YouTube staff to discuss this issue, Mozilla published three concrete actions the platform could take to help address its recommendation problem. These changes entail making YouTube’s recommendation data more transparent.

October 15, 2019 - Mozilla published findings from our #YouTubeRegrets research, highlighting how users were recommended anti-science content, anti-LGBT content, and other harmful videos.

October 26, 2019 - At MozFest 2019, Mozilla hosted “The Algorithmic Gaze,” a panel exploring the dangers of untrustworthy AI like YouTube’s recommendation engine.

December 25, 2019 - Mozilla published a running list of YouTube’s dismissals related to its recommendation problem.

September 2020 - Mozilla launched the RegretsReporter browser extension to research YouTube’s recommendation problem.

July 2021 - Using data contributed by 37,380 RegretsReporter users, Mozilla released its first findings on YouTube's recommendations.

September 2022 - Mozilla released a report scrutinizing YouTube to determine how much control people actually have over the platform’s recommendation algorithm.
