Mozilla Explains: Why Does YouTube Recommend Conspiracy Theory Videos?
Fear leads to anger. Anger leads to hate. Hate leads to...clicks?
Most of the videos you’d find on YouTube are free for you and me to watch. How, then, did YouTube rake in $6 billion in just three months this year? How is the site on its way to rivaling the earnings of Netflix — a service that charges all of its subscribers a monthly fee?
YouTube makes those billions through advertising. The more videos a viewer consumes, the more ad money Google (YouTube’s parent company) rakes in. Simply put, the longer YouTube can keep you on its site, the better for its pocketbook. That’s why, as we mentioned before, YouTube’s recommendation algorithm is designed to keep you watching videos for as long as possible.
This isn’t always a bad thing, but it often can be. Entertaining content isn’t always factually correct or safe. “Hatred is useful for clickbait,” says Guillaume Chaslot, a current Mozilla fellow and former Google engineer, in our latest Mozilla Explains. In some cases, that hatred can lead to radicalization: a viewer falls down a rabbit hole of misleading and violent information, sometimes urged on by YouTube’s recommendation algorithm.
Users can quickly fall prey to a domino effect, where one conspiracy video leads to another. In fact, in our YouTube Regrets series we studied exactly this. We asked Mozilla supporters about the times they felt the algorithm suggested extreme content, and thousands responded to tell us about the weird places they were led: searches for simple dance videos that led to videos about bodily harm, or drag queen self-esteem videos that eventually gave way to anti-LGBTQ content. YouTube users were taken to some eerie corners of the site, all thanks to the recommendation algorithm. Zoom out, and this sequence of events, repeated again and again, may be profitable for YouTube, but it can be dangerous for society.
So what do we do about all this? For starters, you can chime in and tell us about a time YouTube’s suggestion bot led you astray. Chaslot also recommends that regulators step in and pass laws to curb the problem. And there’s something you can do yourself: arm yourself with knowledge. “When you know that YouTube is trying to manipulate you, it doesn’t work as well,” Chaslot says. Hate leads to clicks, but only you can prevent yourself from turning to the dark side.