
Mozilla’s Elections Casebook

Scrutinizing what steps platforms have taken to protect election integrity globally

Feb. 27, 2024

Written by Becca Ricks, Odanga Madung and Open Source Research & Investigations

Overview

In the landscape of global elections, the commitment of tech giants to safeguarding electoral integrity remains in question. This is especially true for elections occurring outside of Europe and North America, in Global Majority countries. In our analysis of public-facing announcements and interventions by large tech platforms, we determined that platforms are consistently unprepared for election cycles and under-resource election interventions in the Global Majority.

Tech platforms tend to adopt a stakeholder-led rather than a platform-led approach, potentially overlooking the nuances of the electoral cycle and the specific vulnerabilities within their own platforms. Frequently, these decisions worsen an already vulnerable information ecosystem. Their interventions, unfortunately, tend to follow a one-size-fits-all strategy, inadequately tailored to the diverse political and social contexts of different countries.

Further, there is notable opacity around the resources, both human and financial, that these companies commit to safeguarding electoral integrity. Significant staff reductions in election integrity teams during 2022 and 2023 mean that even fewer resources are being dedicated to election cycles. Adding to the complexity, platforms overemphasize public social media ecosystems like Meta's while neglecting the burgeoning role of messaging platforms like WhatsApp, which increasingly mirror social media in scope and reach.

Introducing Mozilla’s Elections Casebook

Researchers studying platforms outside of North America and the EU often face a distinct set of challenges, as tech platforms consistently fail to craft election policies and interventions suited to the cultural and political landscape in which they operate. Some of these countries have more fragile democratic institutions or saturated media environments, making them especially vulnerable to manipulation or abuse. As noted in new research, platforms like Meta tend to take a "copy and paste" approach to their policies for global elections, often failing to detect and address networked hate speech, disinformation, and violence.

Complicating matters, researchers face significant barriers to collecting and accessing social media data. The Digital Services Act seeks to address this gap in the EU, but early indications suggest the DSA may have little impact beyond Europe, as its data access provisions are not scoped to cover risks outside the EU. Without access to real-time, detailed platform data, researchers studying elections in their respective countries must rely on alternative methods for monitoring social media platforms during elections.

Our Elections Casebook features three case studies written by researchers who are studying platform dynamics in their respective countries: Brazil, India, and Liberia. The intent of this project is to better understand some of the ways platforms struggle to develop meaningful policies and interventions around elections, especially when faced with complex information environments, languages, and cultural contexts. These case studies highlight some of the ways in which platforms are consistently underprepared for safeguarding elections across the Global Majority.

Brazil Case Study

The Role of WhatsApp and Telegram on the Attacks Against Electoral Integrity and the Threats to Democracy

Read the study

India Case Study

Party Politics and WhatsApp Pramukhs: Messaging Platforms and Electoral Integrity in India

Read the study

Liberia Case Study

Platform Interventions in a Low-Trust Election Environment

Read the study


Patterns and themes across the case studies

The three case studies we are publishing paint a complex picture of how platforms handled general elections in India in 2019, Brazil in 2022, and Liberia in 2023. Some common themes we observed:

1. Political groups developed new tactics to target political messages on closed messaging platforms WhatsApp and Telegram, taking advantage of the platforms’ broadcasting features to reach huge audiences.

According to the research, political groups developed innovative tactics that leveraged the distinct affordances of WhatsApp and Telegram to target audiences online. Although WhatsApp is fundamentally a one-to-one messaging app, it also allows users to post in public and "private" WhatsApp groups that can be as large as 1,024 members, a cap that has consistently risen over the years. Over the past few years, WhatsApp has continued to integrate broadcast and public communications features into its app while effectively evading public scrutiny.

According to Divij Joshi, political parties and interest groups in India have been developing techniques to "micro-target" voters since 2014, and have continued to refine their approach over the past decade. Political groups obtain personal data from data brokers (names, addresses, phone numbers, religion, ethnicity, caste, etc.) in order to create an "audience segment." They then create targeted WhatsApp groups and send those groups political messages tailored to that audience. In his analysis of the 2019 Indian General Elections, Joshi argues that as electoral propaganda has become professionalized, WhatsApp has become a key platform for political groups to disseminate hate speech and disinformation during elections.

2. Researchers had no way to monitor public groups on closed messaging apps like WhatsApp and Telegram.

Messaging apps like WhatsApp and Telegram are central players in the media ecosystems of many countries. During elections, these platforms play an outsized role in how political groups build community and mobilize audiences online. However, these apps are difficult to scrutinize because of technical limitations around how messages are encrypted by the platforms and because of how regulators define what constitutes public versus private interactions online.

During the post-election period in Brazil, conspiracy theories around the election proliferated in public groups on WhatsApp and Telegram, and researchers had zero visibility into what was happening. In their analysis of the Brazilian 2022 General Elections, Lorena Regattieri and Débora Salles argue that chat apps like WhatsApp and Telegram served as effective mass-broadcast tools for participatory disinformation during the post-election phase. In public groups, audiences amplified junk news sources and co-created conspiracy theories that called the integrity of the election into question. Meanwhile, researchers and the general public had no way to monitor the dissemination of election-related disinformation on the platforms. What is needed are solutions that provide more transparency while, critically, affirming people's right to privacy and leaving encryption on closed messaging apps intact.

3. Platforms tended to rely on fact-checking as a first defense, but those efforts didn’t reflect local contexts and languages.

Fact-checking is a core pillar of how large social media platforms like Facebook approach global elections. In countries where platforms tend to dedicate fewer resources and less attention, fact-checking is often effectively the only intervention that is taken to debunk conspiracy theories and moderate hate speech. However, platforms often struggle to find local partners who understand the language, cultural context, and political nuances of the country.

According to Eric Mugendi and Rabiu Alhassan, companies like Meta have a limited presence in Liberia and no local trusted partner performing debunking and content moderation on the ground. During the 2023 Liberian General Election, they say, Meta relied on fact-checkers outside the country, which meant that fact-checking of election-related information on Facebook was largely missing or of poor quality because it lacked local insight. Election monitoring groups in Liberia determined that Facebook posts with false claims proliferated during the election.

Recommendations from the case studies

There are a number of concrete steps platforms and policymakers can take to empower researchers and safeguard elections. One common theme was the need for regulatory frameworks that tackle closed messaging platforms like WhatsApp and Telegram, as well as limits on some of the unique broadcast features these platforms enable. Another was the need for social media platforms to allocate more resources to linguistic and cultural inclusivity in their moderation efforts.

Here are some of the key recommendations that emerged from the case studies:

Recommendations for platforms

1. Closed messaging apps like WhatsApp and Telegram should develop distinct rules for broadcast communications versus those intended for limited circulation.

Closed messaging platforms blur the lines between social media and private messaging. Acknowledging the dual public/private nature of the platform, WhatsApp must develop different rules for mass broadcast communications versus small group messaging. For instance, argues Joshi, WhatsApp could put limits on 'viral' forwarding, including limiting how many forwards a group can receive, limiting group size, or restricting how many individuals can be added to a group. Implementing some of these steps would curb the spread of election-related disinformation, propaganda, and hate speech.
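
To make these limits concrete, here is a minimal sketch in Python of how a platform might enforce forwarding and group-size caps. The thresholds, names, and logic are illustrative assumptions, not WhatsApp's actual implementation:

```python
from dataclasses import dataclass, field

# Hypothetical policy thresholds; real limits differ and change over time.
MAX_GROUP_MEMBERS = 1024        # cap on group size
MAX_FORWARD_TARGETS = 5         # chats one message may be forwarded to at once
FREQUENTLY_FORWARDED_HOPS = 5   # hop count after which a message counts as "viral"

@dataclass
class Message:
    text: str
    forward_count: int = 0  # how many times this message has hopped between chats

@dataclass
class Group:
    name: str
    members: set = field(default_factory=set)

    def add_member(self, user_id: str) -> bool:
        """Enforce the group-size cap at join time."""
        if len(self.members) >= MAX_GROUP_MEMBERS:
            return False  # group is full; the invite is rejected
        self.members.add(user_id)
        return True

def forward(message: Message, targets: list) -> list:
    """Return the subset of target groups this forward is allowed to reach."""
    # A frequently forwarded ("viral") message may only go to one chat at a time.
    limit = 1 if message.forward_count >= FREQUENTLY_FORWARDED_HOPS else MAX_FORWARD_TARGETS
    message.forward_count += 1
    return targets[:limit]
```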

2. WhatsApp should assess the risks of broadcast features before they are launched publicly.

Over the past few years, WhatsApp has rolled out new features in Communities, Groups, and Broadcast Lists that increasingly allow messages to be broadcast to huge audiences of up to 5,000 members. WhatsApp must acknowledge the ways in which these features provide infrastructure for propaganda, disinformation, and hate speech, especially during election periods, and develop processes for evaluating the risks of broadcast tools and affordances. New features should only be rolled out after extensive risk assessment and testing.

3. WhatsApp and Telegram should consider designing more friction into the sharing mechanism in public groups and channels.

For instance, according to Regattieri and Salles, when users attempt to share or forward content to public groups, they could be prompted to pause to reflect on or verify the information. Platforms could integrate more friction into the sharing experience with relatively little effort, and doing so could have a significant impact on how quickly election-related mis- and disinformation spreads.
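
As a rough illustration, the sketch below (in Python, standing in for client code) interposes a confirmation step when a message is bound for a public group. The function names and prompt wording are hypothetical, not taken from either platform:

```python
def confirm_share(message_text: str, group_is_public: bool, ask) -> bool:
    """Interpose a pause-and-reflect step before a message reaches a public group."""
    if not group_is_public:
        return True  # one-to-one and small-group messages go through without friction
    # Forwards to public groups require explicit confirmation first.
    return ask(
        f"You are about to share:\n  {message_text!r}\n"
        "with a large public group. Have you verified this information? Send anyway?"
    )

if __name__ == "__main__":
    # A console prompt stands in for a client UI dialog.
    sent = confirm_share(
        "Forwarded claim about the election results",
        group_is_public=True,
        ask=lambda q: input(q + " [y/N] ").strip().lower() == "y",
    )
    print("Message sent." if sent else "Share cancelled.")
```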

4. Telegram needs to enforce its rules on political advertising by limiting how ads can be run on public channels and by providing greater ad transparency.

Telegram's official policy prohibits political advertising. But according to Regattieri and Salles, the platform also has serious limitations around data accessibility and transparency. What's more, Telegram's advertising portal allows advertisers to run ads on public channels with 1,000+ members. As a result, researchers are unable to assess how the platform is enforcing its own policy or to get access to data about political ads in these public channels.

5. Closed messaging platforms like WhatsApp should work towards sharing more data with researchers.

Because messages are encrypted, platforms themselves have limited visibility into what users are sharing in groups. However, social science research on closed messaging platforms is urgently needed, and platforms could share more data with researchers in ways that still maintain the privacy of their users. For instance, argues Joshi, WhatsApp could share metadata, other information it collects about groups, or even information about its internal moderation practices.
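
As one illustration of what privacy-preserving data sharing could look like, here is a sketch of an aggregate metadata record a platform might expose to vetted researchers. The fields are assumptions extrapolated from Joshi's suggestion, not an actual WhatsApp data product:

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class GroupMetadataRecord:
    """Aggregate, content-free metadata about a public group.

    No message bodies are included, so end-to-end encryption is untouched,
    and identifiers are hashed so individual users cannot be re-identified.
    """
    group_id_hash: str        # salted hash, never the real group ID
    member_count: int
    created_on: date
    messages_per_day: float   # volume only; message content stays encrypted
    forwarded_share: float    # fraction of messages that arrived as forwards
    moderation_actions: int   # e.g. messages removed or accounts banned

record = GroupMetadataRecord(
    group_id_hash="9f2c1a...",   # placeholder value
    member_count=980,
    created_on=date(2022, 9, 1),
    messages_per_day=412.5,
    forwarded_share=0.63,
    moderation_actions=12,
)
print(record)
```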

6. Social media platforms should allocate more resources to better understand each country's online landscape and political context.

Platforms tend to overlook smaller countries during their elections, resulting in the development of inadequate tools and resources. Platforms should dedicate resources towards understanding the cultural context, media landscape, and consumption patterns in a particular country, argue Mugendi and Alhassan.

7. Social media platforms should allocate more resources towards language-inclusive content moderation.

According to Mugendi and Alhassan, it is imperative that fact-checking materials such as fact sheets, fact-checks, and explainers are translated to reflect a country's linguistic diversity, so that everyone has equitable access to the information they need and fact-checking efforts become more effective.

Recommendations for policymakers

1. Policymakers should firmly commit to the right to privacy and agree not to undermine encryption on closed messaging apps like WhatsApp and Telegram.

A number of proposals that would undermine encryption have been put forward recently, says Joshi, including client-side scanning of messages before they are sent. Implementing such proposals would undermine communication safety and open up real possibilities of abuse. It is critical that policymakers do not violate people's right to private communications by weakening encryption.

2. At the same time, policymakers urgently need to develop new governance and regulatory frameworks that address the distinct challenges of closed messaging platforms.

Closed messaging platforms like WhatsApp have continued to roll out new features that let political groups reach larger and larger audiences, without any kind of legal oversight or even internal governance mechanisms. While regulatory frameworks exist to govern public social media platforms like Facebook and Twitter, there are no comparable legal frameworks addressing the distinct challenges of closed messaging apps like WhatsApp. Rather than leaving platforms to operate on their own or to make voluntary commitments, argues Joshi, platform interventions need to be guided by a legal framework: one that allows election authorities to monitor whether a platform is complying with rules on political ad spending or on communication through features like the WhatsApp Business API, for instance.

3. Policymakers should develop stricter privacy laws aimed at the broader data sharing ecosystem, which enables voter targeting.

In India, the microtargeting of political messages on WhatsApp is enabled by the harvesting of people’s personal and public data, says Joshi. Regulators could enforce stricter rules on the collection, sharing and use of personal data by political groups, including the use of ‘public’ voter lists.

Get involved by submitting a case study

Mozilla's Elections Casebook is an ongoing project. To propose a contribution to our casebook, please email us at [email protected] with a short pitch. We are primarily looking to hear from researchers who study platform governance and elections in countries outside of North America and Europe. Contributors will be compensated.