Facebook Groups pose a major threat in this election season.
They’ve become hidden breeding grounds for disinformation campaigns and organizing platforms for extremists. And Facebook’s own algorithmic recommendation engines actively grow these networks by promoting them to unsuspecting users – something the company has known since 2016.
With conspiracy theories, disinformation, and foreign influence running rampant in Facebook Groups, the company must turn off group recommendations until the U.S. election results are certified.
In recent days, the company acknowledged the role of Groups in spreading misinformation by discontinuing recommendations of health Groups to “prioritize connecting people with accurate health information.” While this is a good step, it isn’t a strategy: it’s a never-ending game of whack-a-mole with devastating consequences.
Facebook has known about this problem for years but ignored it, while extremism grew on the platform. In fact, the company has heavily promoted Groups for the last several years, even though in 2016 researchers presented evidence to the company showing that “64% of all extremist group joins are due to [Facebook’s] recommendation tools…” In other words, “[Facebook’s] recommendation systems grow the problem.”
Bad actors will use whichever Groups they can to plant disinformation, and algorithmic recommendations that push Facebook users to join new Groups help grow their potential audience. With a critical election underway, Facebook must take this policy a step further and stop ALL group recommendations until election results are certified.