On December 18th of last year, the European Commission made an important announcement: they were launching a formal investigation into X to assess whether the platform may have breached multiple provisions of the new Digital Services Act (DSA), a sweeping piece of internet regulation focused on transparency and accountability. The announcement marked the first formal proceedings the Commission has opened to enforce the DSA.

Among the issues the Commission said they were investigating is X’s compliance with Article 40.12. Sometimes referred to as the “CrowdTangle” provision, this article is a transparency requirement that specifically obliges platforms to provide real-time access to public data.

The truth is that while there are a lot of different transparency provisions built into the Digital Services Act, Article 40.12 is unique.

For one, it’s one of the few provisions that isn’t awaiting a delegated act. Delegated acts are formal instruments designed to add detail to a law, but they can take months, even years, to finalize. While X has more time to come into compliance with some of the other transparency provisions, Article 40.12 is already law.

Secondly, this provision has some important short-term implications that could have a profound impact on democracies around the world. In the coming year, we’re going to witness more elections than at any point in human history, and one of the things we’ve learned about the internet over the last decade is that making it easy to monitor large platforms in real-time is one of the most effective ways to empower civil society to help protect elections. As CEO of CrowdTangle, I led a team that worked on dozens of elections, from Nigeria to Sri Lanka to the United States, and we saw first-hand how vital this data is to journalists, civil society groups, and election protection efforts.

We also know that policies are only as effective as their enforcement and that’s why this moment is so critical. The early messages the Commission sends around how seriously (or not) they are going to enforce the DSA will play an outsize role in its long-term success or failure and will have a direct impact on the world’s ability to monitor elections this year.

Those realities make 40.12 one of the most urgent provisions in the DSA as we head into 2024. Getting this one right and getting it right quickly matters.

Unfortunately, based on all the information we have available, we think that the European Commission has every right to be concerned that X isn’t in compliance. Let’s walk through why.

  1. First, it’s worth taking a moment to acknowledge that X has spent the better part of the last year actively making it harder for researchers to study their platform. In March, they eliminated their free researcher API, laid off the teams managing it, and shut down the programs that had supported academic research. In the process, they killed thousands of research projects and largely closed the book on researchers’ ability to study the platform.
  2. After nearly a year of making it harder for researchers to study X, on November 4th, X seemed to begin the process of coming into compliance with Article 40.12, albeit very quietly. They published an application form for a researcher API and updated their developer terms of service. Strangely, though, they didn’t announce the change or proactively tell their previous researchers or the broader research community about the new application, let alone walk them through the details of the new program. That’s in stark contrast to TikTok, Meta, and others, who have been working toward compliance by making high-profile announcements, actively partnering with the research community, demoing their new solutions at integrity-related conferences, and more.
  3. We also know the application they made available is a very unusual one. It asks questions that go beyond traditional industry norms for vetting researcher access to public data on social media, including questions about “indirect” funding sources of applicants and detailed information about an organization’s board members and directors. It’s possible those are valid questions from X’s point of view, but they’re not ones normally used for vetting access to public data.
  4. Unfortunately, their efforts seemed to stop there. There is zero public information about whether anyone has been granted access, who has been denied (and why), what the turnaround time is for applications, what sort of data successful applicants can access, what the terms and conditions of that access are, what the API’s rate limits are, how reliable or comprehensive the data is, what level of support is provided, and so on. Needless to say, meaningful compliance with Article 40.12 requires much more than simply publishing an application form in the dead of night and not announcing it to anyone. And unfortunately, the concerns don’t stop there.
  5. Not only is there very little evidence of compliance beyond the launch of a form, but as of January 29th, the only anecdotal data we have suggests that they haven’t actually given anyone access to any data. There are multiple public threads of well-known and reputable researchers who have studied social media data for years sharing stories of getting their applications rejected. While more information is needed, I haven’t been able to find anyone who has been granted access. X’s discussion board is full of people complaining about basic failures in the process, including simply getting any communication back from X at all.
  6. One possibility is that X is working hard behind the scenes to build the infrastructure and tooling necessary for the program. However, we know no such build-out is needed: X already has the infrastructure to provide real-time access to public data. In 2023, they didn’t shut down all their data APIs; they only turned them off for academics. Commercial partners still access and use real-time public data every day.
  7. Finally, it’s important to note that we know this work and this data matter. Researchers have flagged repeated examples of harms proliferating on X over the last year, including misinformation and disinformation flooding the platform around the Israel/Hamas conflict, as well as other risk areas laid out in the DSA. It sometimes feels like every week brings a new example of harmful viral content spreading on the platform that public-interest monitoring could help mitigate. We also know that hundreds of researchers, if not thousands, have lost the ability to monitor what’s happening on X because of the new API pricing. As we approach a year with a historic number of elections, you could make the case that this data has never been more crucial in the history of social media.
  8. In the interest of fairness, it’s also worth asking how X’s effort compares to the current compliance of other Very Large Online Platforms (VLOPs). The truth is that some VLOPs are even less transparent than X, while others have robust programs that they’re actively demoing to the research community. However, in my opinion, none of the platforms are in full compliance at the moment, and that’s why holding a high bar for X is so critical. It’s important to make clear that the Commission expects compliance and will hold platforms accountable. To that end, the Commission should provide guiding principles that spell out what it means for platforms to provide meaningful access to real-time data.

In conclusion, there’s no public evidence that X’s current efforts go beyond simply making a form available.

If their efforts are more significant than that, it is on X to demonstrate how they are providing access to public data in ways that are genuinely usable for researchers, including spelling out who has received access (and who has been denied), providing communication and feedback within reasonable timeframes, documenting the data available, and much more. X needs to provide far more evidence that their efforts go beyond what’s publicly visible if they want to make the case that they’re providing real transparency into public content on their platform.

At worst, X’s current efforts could be a cynical attempt to meet the letter of the law while avoiding the actual work of providing real, meaningful transparency into their systems. Letting X get by with such an effort would not only undermine some of the most meaningful transparency requirements built into the DSA and set a terrible precedent for what sort of compliance the Commission expects from other platforms, but it would also put at risk one of the most important sources of data available to journalists and civil society working to protect elections around the world as we head into a historic year.

Brandon Silverman is the former CEO of CrowdTangle, a social monitoring tool that was used by thousands of civil society groups around the world to monitor public content on social media. He left CrowdTangle in 2021 and has been advising policy-makers on how to make public data from large platforms accessible in responsible ways, including helping with Article 40.12. He’s a Knight Fellow at the George Washington Institute for Data, Democracy and Politics, a Founding Fellow at the Integrity Institute, and an advisor to the Mozilla Foundation.

