In 2019, Mozilla collected countless stories from people whose lives were impacted by YouTube’s recommendation algorithm. People had been exposed to misinformation, had developed unhealthy body images, and had become trapped in bizarre rabbit holes. The more stories we read, the more we realized how central YouTube has become to the lives of so many people — and just how much YouTube recommendations can impact their wellbeing.
Yet when confronted about its recommendation algorithm, YouTube’s routine response is to deny and deflect.
Faced with YouTube’s continuous refusal to engage, Mozilla built a browser extension, RegretsReporter, that allows people to donate data about the YouTube videos they regret watching on the platform. Through one of the biggest crowdsourced investigations into YouTube, we’ve uncovered new information that warrants urgent attention and action from lawmakers and the public.
We now know that YouTube is recommending videos that violate their very own content policies and harm people around the world — and that needs to stop.
What is a YouTube Regret?
The concept of a “YouTube Regret” was born out of a crowdsourced campaign that Mozilla developed in 2019 to collect stories about YouTube’s recommendation algorithm leading people down bizarre and sometimes dangerous pathways. We have intentionally steered away from strictly defining what “counts” as a YouTube Regret to allow people to define the full spectrum of regrettable experiences that they have on YouTube.
What a YouTube Regret looks like for one person may not be the same for another, and sometimes YouTube Regrets only emerge after weeks or months of being recommended videos — an experience that is better conveyed through stories than through video datasets.
This approach intentionally centers the lived experiences of people to add to what is often a highly legalistic and theoretical discussion. It does not yield clear distinctions about what type of videos should or should not be on YouTube or algorithmically promoted by YouTube.
Sometimes YouTube Regrets look like this:
Global Warming: Fact or Fiction? Featuring Physicists Willie Soon and Elliott D. Bloom
A volunteer reported that they were surprised to see a video denying climate change suggested after they watched a Mozilla video!
Omar Connected Harvester SEEN Exchanging $200 for General Election Ballot. “We don’t care illegal.”
A volunteer who mainly watched wilderness survival videos, and no right-wing political content, reported being recommended a video spreading an unfounded claim about U.S. Representative Ilhan Omar and voter fraud in the 2020 U.S. elections.
Man hum1liates f3m1n1st in v1ral video
A volunteer reported that, after watching videos about the U.S. military, they were recommended a video in which the narrator claims to “completely shut down and embarass three women that have opinions about how society should be in first world feminist countries.”
“We set a high bar for what videos we display prominently in our recommendations on the YouTube homepage or through the 'Up next' panel.” (YouTube’s blog)
The most commonly reported category was misinformation, especially COVID-19-related misinformation.
While our research considers the broad spectrum of YouTube Regrets, it is important to note that some videos are worse than others. YouTube has Community Guidelines that set the rules for what is allowed on YouTube and what isn’t. Together with a team of research assistants from the University of Exeter, we evaluated the videos reported to us against these Community Guidelines and made our own determination of which videos should either not be on YouTube or not be recommended by YouTube. Of those videos, the most commonly applicable category was misinformation (especially COVID-19-related). The next largest categories were violent or graphic content, hate speech, and spam/scams. Other notable categories include child safety, harassment & cyberbullying, and animal abuse.
Types of Regretted Content
Our research is powered by real YouTube users.
Specifically: 37,380 volunteers across 190 countries installed Mozilla’s RegretsReporter browser extensions for Firefox and Chrome.
For this report, we gathered reports submitted between July 2020 and May 2021. Volunteers who downloaded the extension but did not file a report were an important part of our study: their data — for example, how often they use YouTube — was essential to our understanding of how frequent regrettable experiences are on YouTube and how this varies between countries.
After delving into the data with a team of research assistants from the University of Exeter, Mozilla came away with three major criticisms of YouTube recommendations: a problematic algorithm, poor corporate oversight, and geographic disparities. And our past research shows that these problems can inflict lasting harm on people.
Here’s what we found
Most of the videos people regret watching come from recommendations.
YouTube Regrets are primarily a result of the recommendation algorithm, meaning videos that YouTube chooses to amplify, rather than videos that people sought out.
71% of all Regret reports came from videos recommended to our volunteers, and recommended videos were 40% more likely to be regretted than videos volunteers searched for.
The YouTube Regrets our volunteers flagged had tons of views, but it’s impossible to tell how many of those views came from YouTube’s recommendation algorithm, because YouTube won’t release this information.
What we do know is that reported videos seemed to accumulate views faster than videos that were not reported.
At the time they were reported, YouTube Regrets had a median of 5,794 views for each day they were on the platform, while other videos our volunteers watched had only 3,312 views per day.
YouTube recommends videos that violate their own policies.
Our data shows that YouTube’s algorithm recommended several videos that were later removed from the platform for violating YouTube’s own Community Guidelines.
In 40% of cases where recommended YouTube Regrets were taken down, YouTube did not provide data about why these videos were removed.
What we do know is that, at the time they were reported, these videos had racked up a collective 160 million views: an average of 760 thousand views per video, accrued over the roughly five months each video had been on the platform. It is impossible to tell how many of these views were a result of recommendations, because YouTube won’t release this information.
Non-English speakers are hit the hardest.
When YouTube is called out for recommending borderline content, the company usually boasts that their policy changes have led to a “70% average drop in watch time of this content coming from non-subscribed recommendations in the U.S.” But what about the rest of the world?
We found that the rate of regrets is 60% higher in countries where English is not the primary language.
This is consistent with statements from YouTube suggesting that the company focused these policy changes on English-speaking countries first.
*Among countries classified as having English as a primary language, the rate is 11.0 Regrets per 10,000 videos watched (95% confidence interval: 10.4 to 11.7). In countries with a non-English primary language, the rate is 60% higher, at 17.5 Regrets per 10,000 videos watched (95% confidence interval: 16.8 to 18.3).
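To make the footnote concrete, here is a minimal sketch of how a per-10,000 rate and its 95% confidence interval can be computed, treating regret counts as approximately Poisson. The counts in the example are hypothetical; the report publishes only the resulting rates and intervals, not the underlying tallies.

```python
import math

def regret_rate_ci(regrets, videos_watched, per=10_000, z=1.96):
    """Rate of regrets per `per` videos watched, with a normal-approximation
    95% confidence interval (treating the regret count as Poisson)."""
    rate = regrets / videos_watched * per
    # Poisson standard error of the count is sqrt(count); scale to same units
    se = math.sqrt(regrets) / videos_watched * per
    return rate, (rate - z * se, rate + z * se)

# Hypothetical counts for illustration only
rate, (lo, hi) = regret_rate_ci(1_100, 1_000_000)
print(f"{rate:.1f} Regrets per 10,000 videos (95% CI {lo:.1f} to {hi:.1f})")
```

With real per-country counts, comparing two such rates (as the report does for English- vs. non-English-primary countries) amounts to checking whether the intervals are well separated.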
We also found that misleading or harmful pandemic-related videos are especially prevalent among non-English regrets.
Of the YouTube Regrets that we determined shouldn’t be on YouTube or shouldn’t be recommended by YouTube, we found that only 14% of English videos were pandemic-related. But among non-English videos, the rate is 36%.
YouTube Regrets can alter people’s lives forever.
Our past work has shown that these video recommendations can have significant impacts on people’s lives.
“In coming out to myself and close friends as transgender, my biggest regret was turning to YouTube to hear the stories of other trans and queer people. Simply typing in the word 'transgender' brought up countless videos that were essentially describing my struggle as a mental illness and as something that shouldn't exist. YouTube reminded me why I hid in the closet for so many years.
Every now and then YouTube will continue to recommend me a video that tells me that my gender identity is wrong — and it reminds me of how much hate is squarely directed at me and people like me. I'm somewhat older, I've been dealing with these issues internally for a long time and I have therapy to work out these issues, but I can't imagine what it's like for those without access to help. YouTube is a part of my pain in coming out and is a reminder of how terrible this world can be to those who are different. I have to be proud in spite of places like YouTube.”
“When my son was preschool age, he liked to watch 'Thomas the Tank Engine' videos on YouTube. One time when I checked on him, he was watching a video compilation that contained graphic depictions of train wrecks.”
“My eight-year-old granddaughter was using our tablet to watch children's videos; unknowingly as grandparents we thought she was safe. The next day we watched as our granddaughter sat at the dinner table speaking candidly to her parents about her fear of cancer.
As it turned out, in the middle of watching how children with medical challenges can do great, inspirational things, the video turned dark and began showing horrifying, raw footage of rotting skins and disfiguration. Now we are working with her to understand cancer and to not fear eating food or drinking water.”
“My 10-year-old sweet daughter innocently searched for 'tap dance videos' and now is in this spiral of recommended videos of extreme 'dance' and contortionist videos that give her horrible unsafe body-harming and body-image-damaging advice... She is now restricting her eating and drinking. I don’t know how I can undo the damage that’s been done to her impressionable mind.”
“My ex-wife, who has mental health problems, started watching conspiracy videos three years ago and believed every single one. YouTube just kept feeding her paranoia, fear and anxiety one video after another. I kept begging her to stop, but she didn't — she couldn't. At one point she believed a helicopter near the house was the government coming to take her and my daughter away (they were really checking the power lines) and called in a blind panic. Now she's convinced the world is going to end any day now and is an extreme religious fundamentalist.
She refuses to even consider professional help because she no longer trusts anyone — especially doctors, the police and any government-run organisation. And YouTube just keeps feeding her more and more of the fear videos. My marriage is now over. Her extraordinary fear has totally consumed her and our life together.”
Our recommendation: Urgent action must be taken to rein in YouTube’s algorithm.
YouTube’s algorithm drives 700 million hours of watch time on the platform every day. The consequences are simply too great to trust the company to fix this on its own. Below, find a summary of Mozilla’s recommendations to YouTube, policymakers, and people who use YouTube. Details on these recommendations can be found in the full report.
Policymakers must require YouTube to release adequate public data about their algorithm and create research tools and legal protections that allow for real, independent scrutiny of the platform.
YouTube and other companies must publish audits of their algorithms and give people meaningful control over how their data is used for recommendations, including allowing people to opt out of personalized recommendations.
Mozilla will continue to operate RegretsReporter as an independent tool for scrutinizing YouTube’s recommendation algorithm. We plan to update the extension to give people an easier way to access YouTube’s user controls to block unwanted recommendations, and will continue to publish our analysis and future recommendations.