Browser extension investigates whether people receive unwanted recommendations, even after opting out

Tool builds on Mozilla’s crowdsourced research into YouTube’s black box AI

(SAN FRANCISCO, CA | DECEMBER 2, 2021) -- On the heels of research exposing how YouTube recommends videos that violate its own policies, Mozilla has built a tool to test whether the platform’s algorithmic controls actually work as advertised.

YouTube’s algorithmic controls purportedly allow people to reject recommendations they don’t like, which trains YouTube’s algorithm not to surface similar videos. These controls include buttons like “Not interested,” “Dislike,” and “Don’t recommend channel,” as well as the ability to pause or turn off watch and search history. Given that YouTube can recommend videos that violate its own policies, such as public health and electoral misinformation, these features have outsized importance.

Now, Mozilla’s RegretsReporter browser extension will determine whether these features truly give people control over the algorithm, or whether they are purely cosmetic, allowing the algorithm to continue recommending unwanted and sometimes harmful videos.


The extension modifies YouTube’s own interface to make user feedback easier and more accessible. YouTube’s current design requires at least two clicks to flag a recommendation as unwanted; RegretsReporter adds a single button overlaid on the thumbnail of every recommended or watched video. The button reads “Stop Recommending” in bold letters, letting people quickly and easily reject recommendations.
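For readers curious how such an overlay can work, the sketch below shows a minimal browser-extension content script, assuming YouTube marks up recommendations with a ytd-thumbnail element. The selector, class name, and click handler are illustrative assumptions, not RegretsReporter’s actual implementation.

```ts
// Illustrative content-script sketch; not RegretsReporter's actual source.
const BUTTON_CLASS = "stop-recommending-overlay";

function addOverlayButton(thumbnail: HTMLElement): void {
  // Skip thumbnails that already carry the button.
  if (thumbnail.querySelector(`.${BUTTON_CLASS}`)) return;

  const button = document.createElement("button");
  button.className = BUTTON_CLASS;
  button.textContent = "Stop Recommending";
  button.style.cssText =
    "position:absolute;bottom:4px;right:4px;z-index:1000;font-weight:bold;";

  button.addEventListener("click", (event) => {
    // Keep the click from opening the video underneath the button.
    event.preventDefault();
    event.stopPropagation();
    // A real extension would trigger YouTube's own feedback controls here
    // and record the event for the study; this log is a placeholder.
    console.log("Stop Recommending clicked");
  });

  thumbnail.style.position = "relative";
  thumbnail.appendChild(button);
}

function annotateAllThumbnails(): void {
  // "ytd-thumbnail" is an assumption about YouTube's markup, not a stable API.
  document.querySelectorAll<HTMLElement>("ytd-thumbnail").forEach(addOverlayButton);
}

// Annotate thumbnails already on the page, then watch for new ones,
// since YouTube renders recommendations dynamically as the page updates.
annotateAllThumbnails();
new MutationObserver(annotateAllThumbnails).observe(document.body, {
  childList: true,
  subtree: true,
});
```

The MutationObserver matters because YouTube loads recommendations continuously rather than in a single page render; without it, buttons would only appear on thumbnails present at load time.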

As a result, YouTube users become YouTube watchdogs: over time, as more people use the extension, they will crowdsource insight into whether YouTube’s algorithm actually listens to feedback. YouTube currently gives independent researchers no access to data like this.

Says Brandi Geurkink, Mozilla’s Senior Manager of Advocacy: “Countless investigations have revealed how YouTube’s algorithm sends people down harmful rabbit holes. Meanwhile, YouTube has failed to address this problem, remaining opaque about how its recommendation AI — and its algorithmic controls — works. Our research will determine if this feature performs as intended, or if it’s the equivalent of an elevator ‘door close’ button — purely cosmetic.”

Geurkink continues: “This extension also makes a statement about the role AI plays in consumer technology. Just as privacy controls should be clear and accessible, so should algorithmic controls.”

This extension follows Mozilla’s July 2021 research, which revealed that YouTube’s algorithm recommends videos that violate the platform’s own policies. That research was fueled by an earlier version of the RegretsReporter extension, which allowed users to report harmful recommendations; that functionality remains in the new version.