YouTube Regrets Findings

July 1, 2021
Platform accountability / AI fairness, accountability, and transparency / Tech policy & regulation


YouTube is the second-most visited website in the world, and its algorithm drives 70% of watch time on the platform—an estimated 700 million hours every single day. For years, that recommendation algorithm has helped spread health misinformation, political disinformation, hateful diatribes, and other regrettable content to people around the globe. YouTube’s enormous influence means these videos reach a huge audience, with deep consequences for countless lives, from radicalization to polarization. And yet YouTube has met this criticism with inertia and opacity.

In 2020, after years of advocating for YouTube to be more transparent about its recommendation algorithm and to allow researchers to study the platform, Mozilla responded to the platform’s inaction by empowering its users to speak up instead. We launched RegretsReporter, a browser extension and crowdsourced research project to better understand the harms that YouTube’s algorithm can inflict on people. 37,380 YouTube users stepped up as watchdogs, volunteering data about the regrettable experiences they had on YouTube for Mozilla researchers to carefully analyze. As a result, Mozilla gained insight into a pool of YouTube’s tightly held data in the largest-ever crowdsourced investigation into YouTube’s algorithm. Between July 2020 and May 2021, these volunteers—spanning 91 countries—collectively flagged 3,362 regrettable videos.