
‘Nothing changed’: A qualitative analysis of YouTube’s user controls

Overview

People test out many different strategies to control their experience with YouTube’s recommendation algorithm. Many use the controls that YouTube explicitly offers, such as the “Don’t Recommend Channel” button, but many also engage in behaviors that YouTube may not anticipate, taking a trial-and-error approach to controlling their experience.

The beliefs that people hold about social media algorithms shape the way they behave online. People develop their own theories about why certain content does or doesn’t show up in their feed,[1] and they may change how they present themselves on the platform in response.[2] They may use different strategies to “game” or control the algorithm, depending on their objectives. In the past, communities of content creators on YouTube have used various tactics to manage how much they show up on the platform,[3] including strategically using thumbnail images to boost views.[4] Many researchers treat such actions as legitimate attempts by people to exercise control in the face of information or power asymmetry.[5] These tactics are ways to wrest back control from the algorithm and can be characterized as a form of user resistance.[6]

In the qualitative research we conducted, we looked at how people talk about their experiences with YouTube’s recommender system. We surveyed 2,757 YouTube users and conducted user interviews in order to better understand people’s feelings of control and autonomy in relation to the platform.

The people who chose to participate in this study are not a representative sample of YouTube’s user base. They are people who voluntarily downloaded the RegretsReporter browser extension and agreed to complete a survey and/or interview. We assume that many of them came to this experiment with a desire to express their feelings about the platform, which prevents us from generalizing our findings to all YouTube users.

Nevertheless, in this study we take seriously the complaints that people express about their experience with YouTube. Referencing Sara Ahmed’s thinking on refusal, resignation and complaint, Burrell and others state that “the act of complaint itself can be a way for people to record their grievances and build solidarity in the face of limited recognition.”[7] Building on their insight, we believe it’s crucial to understand people’s grievances with algorithmic systems in order to understand their expectations for how such systems should behave. In the case of YouTube, such complaints can help us better understand the frustrations that users face in relation to the platform’s recommendation algorithm.

Research questions

We carried out a qualitative study to better understand whether people felt they were in control of their YouTube experience. Going into this study, we had three primary research questions:

  1. What steps do YouTube users take to control their video recommendations?
  2. Do they feel that their recommendations change when they use YouTube’s user controls?
  3. Do they feel that they have control over their recommendations?

We also had some secondary questions, including: How easy is it for YouTube users to find information about how the algorithm works? What information do they want to know? What kinds of alternative features or tools do they say they want? Note that we won’t cover our analysis of answers to these questions in this report.

Research setup

Data collection

To answer our research questions, we ran a survey that invited people to reflect on a specific experience they’d had when they had tried to curate or control their video recommendations: What steps did they take? Why? What kind of effect did they think it had on their experience? Did they feel like they were in control?

Our RegretsReporter community was invited to fill out the optional survey upon downloading the browser extension. The survey ran for four months, from December 2021 to April 2022, and received a total of 2,757 responses. We did not collect demographic data, so we do not have detailed information about the sample.

We also conducted semi-structured interviews in May 2022 with five people who had filled out our survey. Interview participants were selected at random from a pool of people who indicated they were willing to be interviewed by Mozilla researchers, and questions were tailored to their specific survey responses. The goal of the interviews was to contextualize and deepen our understanding of some of the themes that emerged in the survey responses.

Analysis

Once we collected the survey data, we performed a content analysis[8] of the responses, categorizing and tagging them according to a coding schema that combined inductive and deductive approaches. We began with a set of codes that reflected the kinds of responses we anticipated receiving, but ultimately developed the codes based on the responses we actually received. We did not code the interviews, but we did rely on them as sources of rich contextual information that informed our coding of the survey responses.
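For readers curious how coded responses become the mention counts and percentages reported in the findings below, here is a minimal sketch of the tallying step, assuming each response is stored alongside its assigned codes. The code names, IDs, and field names are hypothetical illustrations, not our actual coding schema.

```python
from collections import Counter

# Hypothetical illustration: each survey response paired with the set of codes
# assigned to it during the content analysis. Codes and IDs are made up.
coded_responses = [
    {"id": 2319, "codes": {"privacy_tools", "changed_settings"}},
    {"id": 1615, "codes": {"selective_clicking"}},
    {"id": 2347, "codes": {"avoided_videos"}},
    {"id": 915,  "codes": set()},  # no control steps mentioned
    # ... 2,757 responses in total
]

# Count how many responses mention each code (the "mentions" reported below).
mentions = Counter(code for r in coded_responses for code in r["codes"])

# Prevalence of a code among responses that describe taking any control step.
took_steps = [r for r in coded_responses if r["codes"]]

def prevalence(code):
    return 100 * sum(code in r["codes"] for r in took_steps) / len(took_steps)

print(mentions.most_common())
print(f"{prevalence('avoided_videos'):.1f}% of responses that took steps mention avoiding videos")
```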

Through the coding process, we identified the most prominent codes, or themes, in the responses. We structured our findings around these themes:

  • The control strategies and behaviors people used;
  • People’s impressions of whether their control strategies were successful; and
  • People’s impressions about how the algorithm behaved in response to their control strategies.

Findings


1. People use a broad range of tactics and behaviors in an attempt to control their YouTube recommendations.

Survey respondents were asked to walk us through a specific experience or set of experiences in which they had taken steps to control their YouTube recommendations. From those responses, we identified several tactics, behaviors, and actions that emerged:

Strategy: Used YouTube’s feedback tools

Survey participants mentioned using one or more of YouTube’s feedback tools, often in combination with other behaviors. Of participants who said they took some steps to control their recommendations, 78.3% mentioned using YouTube’s existing feedback tools and/or changing YouTube settings.

Strategy: Used privacy tools and behaviors

It’s not surprising to us that RegretsReporter volunteers engage in privacy-conscious behaviors. Participants talked about using VPNs (15 mentions) and privacy browser extensions (17 mentions) to manage recommendations. They also mentioned routinely deleting cookies (31 mentions) and clearing their browser history. One participant put it this way:

"Stop YouTube from keeping track of my history. Turn off autoplay. Maxed out my YouTube privacy settings. Most successful and satisfactory, though, has been logging out of Google and staying logged out. YouTube still gives me recommendations (I guess it's tracking my usage based on my IP address) but it is a lot less bad." (Survey ID2319)

Survey participants also changed their browser settings in an attempt to avoid unwanted recommendations and protect their privacy. People said that they watched certain videos in private browsing or incognito mode (43 mentions) or created a new account just to watch them (73 mentions).

Participants emphasized that there are situations in which they may want to watch a video on YouTube but don’t want it to affect their recommendations. In those cases, people said they viewed the video in private browsing mode, logged out of their account before watching, or removed the video from their watch history afterward. One interviewee described their experience watching Super Bowl commercials on YouTube:

"When the Superbowl came around… if someone recommended a particular commercial, I used to log out of YouTube, watch the commercial, and then log back in. I have sometimes even gone to a different device just because… I don't want that clouding my recommendations." (Interview ID4)

Strategy: Adjusted viewing behaviors

Participants told us that they intentionally clicked on and watched only the videos they wanted to be recommended. Some said that they proactively rewatched videos they liked in an effort to “teach” the algorithm about their interests. Several participants said that they only watched videos from their subscriptions (65 mentions) or on specific topics. For instance:

"I made a conscious effort to only click on videos that pertained to a ‘target’ category, such as political videos." (Survey ID1615)

Other people actively used features like the search bar (184 mentions) as a tool to influence their recommendations.

"Searched for known credible sources to reset recommendation algorithm" (Survey ID2301)

"Search for some good content or topics and pray the recommendations would adjust after watching them." (Survey ID1558)

Similarly, a number of people said that they intentionally avoided or ignored certain videos (224 mentions) in order to skirt bad recommendations and to “teach” the algorithm about their interests. Some participants said they even avoided “hovering” over unwanted videos (6 mentions) as a way to prevent unwanted recommendations.

"I usually just avoid watching questionable materials so as not to ‘feed’ the algorithm" (Survey ID1762)

"I avoided watching videos I knew were designed to be clickbait, to avoid filling the recommended stream with clickbait videos." (Survey ID2347)

Interestingly, not all of the videos people avoided were completely “unwanted” — some people said they were interested in a video or a topic, but they still avoided it because they were worried that the YouTube algorithm would over-recommend similar content in the future. Some of our participants described their experiences this way:

"I avoided videos on subjects that I found interesting, but which the algorithm was giving too much emphasis to." (Survey ID2589)

"I avoided clicking a video I would like to watch, only because I was worried that doing so would lead me to get politically extreme recommendations." (Survey ID113)

"I actively avoid content that I may want to watch as a guilty pleasure because I don't want recommendations for that kind of content in the main." (Survey ID1064)

Others said they closed the browser window or tab (25 mentions) where they saw the video, in the hope that the platform would interpret this as a negative feedback signal. Others simply stopped using YouTube altogether (5 mentions).

2. More than a third of people said that using YouTube’s controls did not change their recommendations at all.

We learned that people are generally not satisfied with YouTube’s user controls. A significant minority (39.3%) of people who used YouTube’s controls did not feel that doing so affected their recommendations at all. Just over a quarter (27.6%) felt that their recommendations did change in response, and fewer (23.0%) felt the system’s response was ambivalent or mixed. Responses that did not mention engaging with YouTube’s user controls were excluded from this particular analysis (596 responses).

One participant who felt that the algorithm did not respond at all explained:

"Nothing changed. Sometimes I would report things as misleading and spam and the next day it was back in. It almost feels like the more negative feedback I provide to their suggestions the higher bullshit mountain gets. Even when you block certain sources they eventually return." (Survey ID915)

A participant who had a more positive experience said:

"The channels I asked not to be shown were not shown anymore. Also, lots of times instead of getting relevant recommendations I get a section called 'watch again' with some videos from creators I am already subscribed or that I've seen already." (Survey ID927)

Many participants had ambivalent or conflicted feelings about how the algorithm behaved in response to their actions. One participant put it this way:

"They did sort of change, although I feel that ‘she likes music theory’ info packet that the algorithm had didn't get removed entirely, just lowered in priority. I still get the recommendations, and I still shoot them down; the recommendations are no longer half of my home page, maybe like 1/6th of my page." (Survey ID811)

Another participant expressed that the algorithm changed in response, but not for the better:

"Yes they did change, but in a bad way. In a way, I feel punished for proactively trying to change the algorithm's behavior. In some ways, less interaction provides less data on which to base the recommendations." (Survey ID112)


3. Of those people who had mixed experiences, common themes included unwanted videos creeping back in, controls that don’t work as anticipated, and the significant effort required.

Many of our participants were conflicted about whether or not using YouTube’s tools impacted their recommendations and their overall user experience. Among people who had ambivalent or mixed responses to this question, several common experiences emerged. During the inductive coding process, we categorized responses with mixed experiences as follows:


(a) People said that at first their recommendations changed, but that unwanted videos “crept” back in over time.

This theme was present in 23.7% of coded responses from our “mixed” group. This group of users noticed some positive changes after they used the controls, but said that over time unwanted recommendations would return. Some blamed themselves for accidentally clicking a clickbait video and ruining their recommendations, while others blamed the algorithm for giving too much weight to clickbait videos.

“They change for a time, but reappear later on again. Some recommendations seem to be driven by trend [sic] created by larger audiences…The algorithm favours these trends and overwrites individual selections. I do not think it creates a general profile of individual users, or if so ignores it after a while.” (Survey ID265)

“Eventually it always comes back. The algorithm seems incapable of remembering a lesson for very long.” (Survey ID187)


(b) Many people said that clicking the “Don’t Recommend Channel” button seemed effective at blocking a specific creator or channel, but that they continued to be recommended similar videos from other channels.

This theme was present in 12.3% of coded responses from our “mixed” group. This group of users observed that when they clicked “Don’t Recommend Channel,” it seemed to actually work most of the time — videos from that particular channel were mostly blocked. However, they said that they would continue to be recommended videos on similar topics from different channels. This experience highlights an expectation gap we encountered throughout our research: people simply do not have clear information about what each of these controls is designed to do.

“They do not re-recommend the precise channels I say an outright 'no' to but still recommend alternative channels of almost identical content in their place.” (Survey ID112)

“That specific channel was hidden, but other related videos popped up. Takes several other ‘hide channels’ plus watching videos on a completely different topic to fade those out. Sometimes they still come back.” (Survey ID112)


(c) Some people said that gradually their recommendations changed, but it took a significant amount of time and sustained effort on their part.

This theme was present in 9.3% of coded responses from our “mixed” group. People emphasized that changing their recommendations required vigilance: repeatedly sending feedback signals to YouTube, curating their subscriptions and watch history, and avoiding certain videos. Many expressed frustration about the effort required or about how slowly the algorithm changed.

“I feel like I have to constantly curate the videos YT thinks I will like, or wants me to like. It's an ongoing battle.” (Survey ID112)

“It seems like you routinely have to prune things away or it will keep shoving them in your face until you tell it otherwise.” (Survey ID112)


Takeaway

YouTube’s user controls aren’t designed in a way that allows people to actively shape their experience on the platform. Our research demonstrates that YouTube’s current controls leave people feeling confused and frustrated. Participants say they don’t understand how YouTube’s feedback tools affect what they are recommended, and they often do not feel in control of their experience on the platform. Many resort to a trial-and-error approach that mixes tools, behaviors, and tactics, with limited success.

In the next section of the report, we examine whether people’s feelings that they are not in control were validated when we analyzed the interaction data we collected from RegretsReporter users.


Footnotes

  [1] Taina Bucher, “The Algorithmic Imaginary: Exploring the Ordinary Affects of Facebook Algorithms,” Information, Communication & Society 20, no. 1 (January 2, 2017): 30–44, https://doi.org/10.1080/1369118X.2016.1154086.

  [2] Michael Ann DeVito, “Adaptive Folk Theorization as a Path to Algorithmic Literacy on Changing Platforms,” Proceedings of the ACM on Human-Computer Interaction 5, no. CSCW2 (October 18, 2021): 339:1–339:38, https://doi.org/10.1145/3476080.

  [3] Sophie Bishop, “Anxiety, Panic and Self-Optimization: Inequalities and the YouTube Algorithm,” Convergence 24, no. 1 (February 1, 2018): 69–84, https://doi.org/10.1177/1354856517736978.

  [4] Taina Bucher, “Cleavage-Control: Stories of Algorithmic Culture and Power in the Case of the YouTube ‘Reply Girls,’” in A Networked Self and Platforms, Stories, Connections (Routledge, 2018).

  [5] Jenna Burrell, Zoe Kahn, Anne Jonas, and Daniel Griffin, “When Users Control the Algorithms: Values Expressed in Practices on Twitter,” Proceedings of the ACM on Human-Computer Interaction 3, no. CSCW (November 7, 2019): 138:1–138:20, https://doi.org/10.1145/3359240.

  [6] Michael A. DeVito et al., “‘Algorithms Ruin Everything’: #RIPTwitter, Folk Theories, and Resistance to Algorithmic Change in Social Media,” in Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, CHI ’17 (New York, NY, USA: Association for Computing Machinery, 2017), 3163–74, https://doi.org/10.1145/3025453.3025659.

  [7] Burrell et al., “When Users Control the Algorithms.”

  [8] Johnny Saldaña, “An Introduction to Codes and Coding,” in The Coding Manual for Qualitative Researchers, 3rd edition (Sage, 2016).