YouTube feedback buttons ‘Dislike,’ ‘Not interested,’ and others prevent fewer than half of unwanted algorithmic recommendations, according to study
Findings follow past research that revealed YouTube recommends offensive content that violates its very own guidelines
(SAN FRANCISCO, CA | TUESDAY, SEPTEMBER 20, 2022) -- YouTube’s user controls — buttons like “Dislike” and “Not interested” — largely fail to help users avoid unwanted recommendations like misinformation and violent content, according to new research by Mozilla. An accompanying survey also found that YouTube’s controls routinely frustrate and confuse users.
Indeed, Mozilla’s research found that even when people experiencing unwanted recommendations turn to the platform’s user controls for help, fewer than half of those unwanted recommendations are prevented.
This is especially troubling because Mozilla’s past research shows that YouTube recommends videos that violate its very own community guidelines, like misinformation, violent content, hate speech, and spam. For example, one user in this most recent research asked YouTube to stop recommending war footage from Ukraine — but shortly after was recommended even more grisly content from the region.
The study, titled “Does This Button Work? Investigating YouTube's ineffective user controls,” is the culmination of months of rigorous qualitative and quantitative research. The study was made possible by the data of more than 20,000 participants who used Mozilla's RegretsReporter browser extension, and by data about more than 500 million YouTube videos.
Says Becca Ricks, Senior Researcher at Mozilla: “We learned that people don’t feel YouTube’s user controls are effective tools for managing the content they see. Our research validates these experiences — the data shows that people don’t actually have much control over the YouTube algorithm.”
Says Jesse McCrosky, data scientist with Mozilla: “Our study found that YouTube’s user controls have a negligible impact on preventing unwanted recommendations, leaving people at the mercy of YouTube’s recommender system. As a result, YouTube continues to recommend videos that people have clearly signaled they do not want to see, including war footage and gruesome horror clips.”
The qualitative aspect of the research entailed a survey of 2,758 of the 20,000 RegretsReporter users about their experiences with YouTube’s user controls. A majority (62.3%) felt the controls did nothing or produced only mixed results in changing their recommendations. The survey also revealed that people find YouTube’s controls difficult to understand, and as a result many users jury-rig solutions to banish unwanted recommendations, like switching devices or accounts.
The quantitative aspect of the research confirmed users’ beliefs, revealing the ineffectiveness of YouTube’s user control buttons “Dislike,” “Not interested,” “Don’t recommend channel,” and “Remove from watch history.” How? RegretsReporter overlaid a new “Stop recommending” button onto users’ recommended videos. This button would trigger one of YouTube’s aforementioned user controls, except for participants in the control group, for whom the button did nothing at all. Meanwhile, the extension tracked which videos people rejected and which videos YouTube subsequently recommended.
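The experimental design described above — random assignment to one of the feedback buttons or to a placebo group, then comparing how many unwanted recommendations each group still received — can be sketched roughly as follows. This is a minimal illustration, not Mozilla's actual RegretsReporter code; all function and arm names here are hypothetical.

```python
import random

# Hypothetical sketch of the study's randomized design; these names are
# illustrative, not taken from Mozilla's actual browser extension.
ARMS = [
    "dislike",
    "not_interested",
    "dont_recommend_channel",
    "remove_from_watch_history",
    "control",  # placebo arm: the "Stop recommending" button does nothing
]

def assign_arm(user_id: int, seed: int = 0) -> str:
    """Deterministically assign a participant to one experiment arm."""
    rng = random.Random(user_id * 1_000_003 + seed)
    return rng.choice(ARMS)

def prevention_rate(bad_recs_in_arm: float, bad_recs_in_placebo: float) -> float:
    """Fraction of unwanted recommendations an arm prevented, relative to placebo."""
    if bad_recs_in_placebo == 0:
        return 0.0
    return 1.0 - bad_recs_in_arm / bad_recs_in_placebo
```

On this kind of accounting, an arm whose participants saw 57 unwanted recommendations for every 100 seen by the placebo group would score a 43% prevention rate, matching the study's headline figure for “Don’t recommend channel.”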
In addition to demonstrating that YouTube’s controls are ineffective, the research surfaced several startling anecdotes. For example, one user outside the control group asked YouTube to stop recommending firearm videos, yet shortly afterward was recommended more gun content. Another user outside the control group asked YouTube to stop recommending cryptocurrency get-rich-quick videos, only to be served yet another crypto clip soon after.
Mozilla has been investigating YouTube's recommendation algorithm since 2019, given the AI’s outsized influence: YouTube is the second-most popular website in the world, and its algorithm drives most of the video views on the platform — more than channel subscriptions or search. Much of Mozilla’s research is conducted using RegretsReporter, the open-source browser extension that transforms regular users into citizen scientists.
TOP RESEARCH FINDINGS:
People don’t trust YouTube’s user controls. More than a third (39.3%) of people surveyed felt YouTube’s user controls did not impact their recommendations at all, and 23% felt the controls produced mixed results. Said one interviewee: “Nothing changed. Sometimes I would report things as misleading and spam and the next day it was back in [...] Even when you block certain sources they eventually return.”
People take matters into their own hands. Our study found that people did not always understand how YouTube’s controls affect their recommendations, so they took a jury-rigged approach instead. People will log out, create new accounts, or use privacy tools just to manage their YouTube recommendations. Said one user: “When the Super Bowl came around… if someone recommended a particular commercial, I used to log out of YouTube, watch the commercial, and then log back in.”
The data confirms people are right. The most “effective” user control was “Don’t recommend channel,” yet even it prevented only 43% of unwanted recommendations compared with users who did not use YouTube’s controls, and recommendations from the unwanted channel sometimes persisted. Other controls were even less effective: the “Not interested” tool prevented only 11% of unwanted recommendations.
YouTube can fix this problem. YouTube has the power to confront this issue and do a better job of enabling people to control their recommendations. Our research outlines several concrete suggestions to put people back in the driver’s seat: making YouTube’s controls more proactive so that users can shape their own experience, and giving researchers increased access to YouTube’s API and other tools.