New research from Mozilla Fellow Emmi Bevensee examines how P2P technologies are being leveraged to spread toxic content — and how some are pushing back.

Decentralized and open-source technologies play an outsized role in keeping the internet healthy. But like any technology, they can also be harnessed by bad actors — and put to use making the internet a less healthy, more dangerous place.

Today, Mozilla Fellow Emmi Bevensee is publishing new research into just this: how hate groups in the U.S. are using peer-to-peer (P2P) technologies to spread disinformation, amplify toxic content, and incite violence. The report, titled “The Decentralized Web of Hate,” is an investigation into the online tools and tactics used by the modern white supremacist movement. It will be published by Rebellious Data LLC, a social-good data science consulting firm that Emmi and their team are launching. The report also examines possible solutions to these problems.

Says Emmi: “As major internet platforms like Twitter and YouTube crack down on hate groups, these online communities don’t just go away. Instead, there’s been an exodus to spaces that are more difficult to scrutinize and moderate, but still have the potential to reach a mass audience.”

“As a result, toxic and dangerous content continues to flourish online. And, it’s now happening in the decentralized spaces that could be havens from harassment for queer, trans, and PoC communities as well as social justice movements more broadly.”

Among Emmi’s key findings in the report:

Radicalization is becoming harder to address. Major platforms like YouTube use imperfect algorithms for both recommendations and automatic content moderation, and they host communities that can misinform and radicalize impressionable users. “Radicalization” refers to pipelines in which users are exposed to ever more extreme racist ideologies and behaviors over time. Centralized approaches to moderation, such as a top-down moderation or safety team, don’t work on P2P technology, because the technology itself relies on decentralizing authority. As more white supremacists migrate to P2P technology, the risk that they organize violence through these tools also increases.

Modern hate is not as responsive to top-down deterrence. As many white supremacists expand their use of “leaderless” tactics, they are becoming more agile at routing around centralized efforts to thwart them, such as policy changes, automatic content moderation, or the arrest of “lone-wolf” attackers. The decentralization of white supremacist groups is increasingly facilitated by irrepressible, encrypted P2P technology. As a result, many traditional government tools, such as legislation or surveillance, are proving less effective against this modern threat landscape. Only a network can defeat a network.

There are emerging decentralized solutions. Certain P2P tools have introduced novel ideas for combating harmful content. Some platforms have unveiled user agreements and urged their communities to block support for problematic tools. Other platforms have introduced “abuse audits” to identify and mitigate potential threats to users. Because of the technical and social nature of the problems we face, our solutions must also be largely decentralized.

Decentralization helps solve many problems, but also raises new challenges. P2P technologies can help address many of society’s greatest coordination problems, from public transportation and supply chains to positive social connectedness and collaboration. However, the challenges they ask us to face don’t have easy solutions.

To read the full report, visit