The EU’s Digital Services Act (DSA) aims to reinforce the transparency and accountability of digital services, from hosting services to online platforms. Above all, it will set new standards for very large online platforms and search engines, or VLOPs and VLOSEs, with over 45 million monthly active users in the EU. The regulation entered into force in November 2022, kicking off the implementation phase and further regulatory initiatives (delegated and implementing acts, guidelines, and codes of conduct) that spell out procedural aspects and define elements of the law more clearly (roughly analogous to ‘rulemaking’ in the US). The DSA will become fully applicable on 17 February 2024, but certain rules will apply sooner to VLOPs and VLOSEs.

Now the European Commission is working on a delegated act to establish the procedure for conducting annual independent audits of VLOPs and VLOSEs (Article 37). The prospect of finally being able to systematically audit the largest social media platforms is exciting (yes, audits are exciting!) to many civil society organizations advocating for greater platform accountability. To date, big tech’s efforts to assess the harms of their services have been largely voluntary and carried out on their own terms. For example, Meta infamously refused to release the full results of the human rights impact assessment it commissioned in 2019 in relation to the spread of hate speech and incitement to violence in India. Many hope those days are numbered.

Article 37 DSA

The DSA has a specific provision for third-party auditing, which refers ‘narrowly’ to independent audits conducted by professional auditors: once per year, VLOPs and VLOSEs must commission and pay for a third-party audit to assess their compliance with their obligations under the regulation. Stipulations to be clarified in the secondary legislation will aim to prevent conflicts of interest and to ensure expertise, competence, objectivity, and professional ethics.

These audits must evaluate platforms’ compliance with their due diligence obligations on transparency and safety (their Chapter III obligations), as well as with the commitments they have made under any codes of conduct they take part in and any applicable crisis protocols. The sum of these aspects is substantial, ranging from transparency reporting to notice and action mechanisms to user experience obligations, and more.

The auditors will submit a report with an opinion that is ‘positive’, ‘positive with comments’, or ‘negative’. Where the opinion is not ‘positive’, the report must include recommendations on how to achieve compliance and a timeframe for doing so. The delegated act will specify rules for the performance of the audits: the article refers specifically to “procedural steps, auditing methodologies and reporting templates” (Article 37(7)).

A layered approach

The DSA can be seen as offering a layered model of oversight, in which independent audits are one key piece in a puzzle of important due diligence obligations. Other essential and interlinked pieces are risk assessments and mitigation measures (Articles 34 and 35), which oblige platforms to conduct self-assessments of their ‘systemic risks’ and to put forward and implement measures to address them. Platforms will also be mandated to share data with vetted researchers so that these systemic risks can be further scrutinized (Article 40). And of course the regulation provides for inspection by regulatory authorities (Chapter IV, Section 4). The European Centre for Algorithmic Transparency (ECAT) has been established by the European Commission to support it with technical expertise.

These oversight layers will interact and should not be viewed in silos. For instance, the self-assessments released by platforms and the research outputs of vetted researchers will be crucial inputs for independent auditors. It is the combination of these pieces of the due diligence puzzle, and especially the relationships between them, that holds the promise of meaningful accountability for VLOPs.


Different due diligence tools

It’s important to bear in mind that there are a variety of due diligence tools in this space. Exactly what the Commission will prescribe remains to be seen.

Many civil society organizations advocated for Human Rights Impact Assessments (HRIAs) in the DSA: comprehensive evaluations of the effects of a business’s activities on all human rights. Best practice for HRIAs requires meaningful stakeholder engagement, and they can take years to complete. Meanwhile, various techniques for auditing algorithms, AI systems, and datasets have become more common methods for tech accountability. The ecosystem of ‘adversarial’ algorithmic auditors is growing and proving a powerful force, thanks to the contributions of Mozilla Fellows Deb Raji and Abeba Birhane, the Open Source Audit Tooling Project, tracking.exposed, and other similar projects.

Impact assessments and algorithmic audits are different, though certainly complementary, due diligence tools. Impact assessments are usually conducted ex ante, to assess prospective impacts, while algorithmic audits are usually conducted ex post, to assess actual outcomes against their targets. HRIAs are far more comprehensive processes in terms of whom they engage and what they assess, which matters given that online platforms are more than just their algorithms.

That said, the line between ex ante and ex post analyses may blur when it comes to platform auditing, and that could be a good thing: risk mitigation should be informed by previous assessments. The cyclical nature of the auditing and assessment layers is important.


Audits versus audits

The fields of impact assessment for tech platforms and of adversarial algorithmic auditing are relatively new, but auditing has existed for many years in other sectors, for instance in finance, environmental compliance, and IT security.

Looking back at the DSA, it’s particularly useful to distinguish between ‘adversarial’ and ‘cooperative’ audits. The DSA’s independent audit requirement is specifically for a cooperative audit, commissioned and paid for by the company (yes, you read that right). Still, many elements of the provision as outlined in the DSA text are extremely promising. That’s why this delegated act is one of the more important ones to civil society advocates and experts keen to ensure the effectiveness of the DSA.

Finally, audits under the DSA should not be confused with the audits being debated in the EU AI Act, which refer to conformity assessment procedures for ‘high-risk’ AI systems, meant to certify those systems’ compliance with the AI Act’s requirements.

So, what’s promising in the DSA’s auditing provision

  • Implement recommendations or face fines: If the audit report is not ‘positive’, the platform must either implement the recommendations addressed to it (Article 37(6)) or take alternative actions to address non-compliance. Failure to adequately mitigate risks could constitute an infringement of the regulation and lead to fines (Article 74).

And what’s… more worrying (at least right now)

  • Insufficient transparency: The first audits will be conducted following platforms’ internal risk assessments. It’s possible that nothing from the risk assessments or the audit reports will be made public until December 2024, and then only in versions redacted of certain confidential information.
  • Requirements for independence: The fact that platforms pay for audits isn’t inherently bad. Limits on the number of audits an auditor can conduct for the same platform and on the other services they can provide to the company, along with the requirement that fees must not be contingent on the result of the audit, are designed to prevent conflicts of interest. But there is still much for the secondary legislation to stipulate to ensure auditor independence.
  • A rubber-stamp exercise? Given the breadth of the obligations they must assess, auditors will face extremely difficult decisions about where to focus their efforts. It would be unfortunate if the independent audit became simply a rubber-stamp exercise, signing off on the platform’s self-reporting. It’s important that auditors avoid a streetlight effect, only looking where platforms have already cast some light.
  • Competence and diversity of auditors: These audits will be highly complex, technical, and wide-ranging. Platform auditing is a new field, and the skills and resources needed to perform these audits may be hard to come by. While established firms (like the ‘big four’ accounting firms) may have the resources to conduct platform audits, they are less likely to have the experience with fundamental rights and platform accountability questions that civil society groups and researchers have developed over recent years.
  • Civil society engagement: The DSA doesn’t specify how civil society and impacted groups will be engaged in the audit process, even though their insights are critical. Recital 92 of the DSA mentions that auditors should make use of studies by vetted researchers and other objective sources. The involvement of civil society, impacted groups, and researchers will be essential to overcoming the streetlight effect of auditors’ over-reliance on platform self-reporting. Ensuring a strong data access regime and linkages between auditors and the research community will be important to the success of this article.

Next Steps

Work on this delegated act has already begun. The act will likely be published for consultation in February or March 2023 and enter into force in September 2023. When it is released, it will appear on this European Commission webpage. If the shape and effectiveness of this platform auditing regime are important to you, watch this space!