Tell YouTube: Fix your feedback tools and give users real control over their video recommendations!

[Image: a series of inputs feeding into a YouTube icon, projected onto a screen]

From conspiracy theories to propaganda, YouTube’s algorithm often promotes a minefield of controversial videos. Have you ever wondered why it keeps recommending the same clickbait content even after you click “dislike” or “do not recommend”?

So have we, and we looked into it.

With the help of 20,000 Mozilla supporters, we studied YouTube’s feedback mechanisms. What we found, published in our “Does This Button Work” report, is alarming:

  • Users are not in control of their experience on the video platform
  • Even after using the feedback mechanisms, YouTube’s algorithm often recommends unwanted videos

This is even more worrying in the context of our previous YouTube research: in our 2021 “YouTube Regrets” study, we found that YouTube’s algorithm promotes videos containing misinformation, hate speech and violence.

A video recommender system that has been found to regularly recommend dangerous content, and that does not consistently listen to user feedback, desperately needs to be fixed.

The good news is that YouTube can solve this problem, if it wants to. It’s time for the company to fix its feedback tools and put people in the driver’s seat.

Sign our petition now and help us put pressure on YouTube.
