Misinfo Monday: How Recommendation Algorithms Can Lead Us Astray

Misinfo Monday is a weekly series by Mozilla where we give you the tools, tips and tricks needed to cut the crap and find the truth. Our guest post this week comes from Brandi Geurkink, Senior Campaigner at Mozilla Foundation and head of Mozilla’s YouTube Regrets effort. Learn more about it here. For more Misinfo Monday, check back weekly on our blog or on our Instagram.

Over the weekend, a tweet caught my attention. Fifty-nine days before the US election, during a time of heightened racial and economic tension, YouTube sent a push notification to an unknown number of users suggesting that they watch a video called “Civil War Is Coming” by the channel Heavy Duty Country. The user who shared the notification noted that nothing in his viewing history suggested he would be at all interested in that video.

Yelp CEO Jeremy Stoppelman (@jeremys on Twitter) spotted a notification from YouTube promoting a video from the channel Heavy Duty Country, captioning it “How Google is destroying American society exhibit A.”

It caught my attention because of how similar it was to the thousands of stories I read almost one year ago when we launched YouTube Regrets—a campaign where we used real YouTube recommendation horror stories submitted by our supporters to pressure YouTube to take the problems with its recommendations seriously. We received thousands of stories from people who were disturbed and harmed by content recommended by YouTube, including videos spreading homophobia, racism, conspiracy theories and health misinformation. They had no idea why they got these recommendations or how to make them stop.

Despite a strong moral stance against some of these ideas, our solution has never been to censor them. I watched the “Civil War Is Coming” video, and I believe there’s nothing in it that violates YouTube’s policies or warrants its removal from the platform. However, as Mozilla fellow and leading disinformation researcher Renee DiResta has written, “free speech is not the same as free reach.” Even if this video is available on the platform, how and why did YouTube choose to actively promote it? It seems both irresponsible and counterproductive for YouTube to actively recommend this while also claiming to surface authoritative election news to its users. And it leaves me wondering: how many of the 650,000 views on this video were a result of YouTube suggesting that people watch it? How many of the 7.1 million views of “Plandemic,” a COVID-19 conspiracy theory video that went viral in May, were driven by YouTube itself recommending the video to users through the Up Next panel before removing it from the platform two days later?


There is currently no way for us to answer these questions, nor to verify any answers YouTube might give, because YouTube does not make this information publicly available. In the absence of information about what the company recommends to users and why, we have no way to scrutinize or contextualize these anecdotes.

That’s why, later this month, Mozilla will release a browser extension that gives people a way to report their YouTube regrets. We’ll draw insights from our crowdsourced research and make that information available as a first step towards enabling transparency into recommendation algorithms. Our goal is to help all of us—including YouTube—understand how users feel about their recommendations and identify specific ways these systems can be improved. We know that until YouTube itself provides this data to independent researchers our picture will be incomplete, but the stakes are too high to sit idly by as misinformation and harm from recommendations spread through our communities at a much faster pace than they otherwise would.

Follow us on Twitter and Instagram to learn more about the Regrets Reporter extension when it launches on Thursday, Sept. 17.

