RegretsReporter
The RegretsReporter browser extension, built by the nonprofit Mozilla, helps you take control of your YouTube recommendations.
YouTube User Control Study Findings
Powered by data donated by 22,722 people, Mozilla scrutinized YouTube to determine how much control people actually have over the platform’s recommendation algorithm. This is what we learned.
RegretsReporter Dataset
The RegretsReporter dataset can be used for research and investigations into the impact of YouTube's recommendation engine on communities around the world.
YouTube Regrets Findings
Mozilla and 37,380 YouTube users conducted a study to better understand harmful YouTube recommendations. This is what we learned.
YouTube Regrets Stories
Got YouTube Regrets? Thousands do! These are the stories Mozilla collected when we asked people about the videos they wish they’d never clicked that sent them down a rabbit hole…
Blog Posts
Advocacy July 6, 2020
Congratulations, YouTube... Now Show Your Work
Earlier this week, YouTube finally acknowledged that its recommendation engine suggests harmful content. It’s a small step in the right direction, but YouTube still has a long history of dismissing independent researchers. We created a timeline to prove it.
Brandi Geurkink and Helena McDonald
July 2019 - Mozilla awarded Guillaume Chaslot, a former YouTube engineer, a fellowship to support his work investigating AI systems, including the impacts of YouTube’s recommendation engine.
August 7, 2019 - Mozilla announced our intention to investigate YouTube’s recommendation problem and to begin a dialogue with the company about making improvements.
August 14, 2019 - Mozilla published a reading list for our community detailing the ways recommendation AI — including YouTube’s — can manipulate consumers.
September 3, 2019 - Mozilla hosted a Q&A with a teenager who was radicalized online, partially due to YouTube’s recommendation AI.
September 10, 2019 - Mozilla launched its #YouTubeRegrets campaign, urging users to share stories about harmful recommendations they encounter.
October 14, 2019 - After meeting with YouTube staff to discuss this issue, Mozilla published three concrete actions the platform could take to help address its recommendation problem. These changes entail making YouTube’s recommendation data more transparent.
October 15, 2019 - Mozilla published findings from our #YouTubeRegrets research, highlighting how users were recommended anti-science content, anti-LGBT content, and other harmful videos.
October 26, 2019 - At MozFest 2019, Mozilla hosted “The Algorithmic Gaze,” a panel exploring the dangers of untrustworthy AI like YouTube’s recommendation engine.
December 25, 2019 - Mozilla published a running list of YouTube’s dismissals related to its recommendation problem.
September 2020 - Mozilla launched the RegretsReporter browser extension to research YouTube’s recommendation problem.
July 2021 - Using data contributed by 37,380 RegretsReporter users, Mozilla released its first findings on YouTube’s recommendations.
September 2022 - Mozilla released a report scrutinizing YouTube to determine how much control people actually have over the platform’s recommendation algorithm.