In the process of implementing the Digital Services Act, the European Commission has opened two public consultations on topics relevant to platform watchdogs and algorithmic accountability experts. The DSA is the EU’s new regulatory framework to reinforce the transparency and accountability of digital services, with particular focus on a subset of online platforms with over 45 million monthly active users in the EU. The first batch of these Very Large Online Platforms and Very Large Online Search Engines (VLOP/SEs) was officially designated in April, starting a four-month clock for them to comply with obligations related to empowering and protecting users, providing fair and robust content moderation, and assessing and mitigating their systemic risks.

The DSA deal is not quite done. Key procedural aspects are still being elaborated, and the central European regulator (the European Commission) and the national regulators in each EU country (the Digital Services Coordinators) are getting established and preparing for enforcement. There are two non-legislative initiatives to supplement the legislation which are particularly relevant to the platform watchdog/algorithmic accountability community: one on independent auditing and one on data access.

Data access and auditing are closely connected, because the DSA offers a layered model of oversight:

  • First, platforms self-assess their systemic risks (Articles 34 & 35 - this process has started for the 19 recently designated VLOP/SEs).
  • They then commission an external party - the independent auditor or auditors (plural) - to validate their own assessment, along with their compliance with their other obligations, which range from adequate content moderation capacity, to increased transparency of recommender systems and advertising, to the protection of minors.
  • On top of these independent audits, platforms must grant regulators and vetted researchers access to data to assess their compliance and further investigate their systemic risks.

If you are a researcher, a civil society watchdog, an algorithmic expert, an adversarial auditor, or generally someone uncovering platform and algorithmic harm, you might be wondering how your expertise will fit into this new oversight puzzle. Many of the same tools and methods are useful for both research and auditing: for example, data acquisition via a permissioned access mechanism such as an API, through crowdsourcing, or through automated collection like scraping.
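To make that overlap concrete, here is a minimal Python sketch of the first of those routes: pulling data from a permissioned researcher API. The endpoint, token, and field names are hypothetical, since the DSA does not prescribe a concrete interface; only the general pattern (authenticated request, structured response, data saved for local analysis) carries over to whatever real access regimes platforms end up offering.

```python
import requests  # third-party HTTP client

# Hypothetical researcher-access endpoint and credentials: the DSA does not
# prescribe a concrete API, so the URL, token, and fields below are illustrative.
API_BASE = "https://platform.example/researcher-api/v1"
ACCESS_TOKEN = "granted-after-vetting"

def fetch_recommended_items(panel_id: str, limit: int = 100) -> list[dict]:
    """Pull the items recommended to a consenting panel member (illustrative)."""
    response = requests.get(
        f"{API_BASE}/recommendations",
        params={"panel_id": panel_id, "limit": limit},
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["items"]

if __name__ == "__main__":
    items = fetch_recommended_items("panel-0042")
    print(f"Collected {len(items)} recommended items for later coding and analysis")
```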

The DSA’s data access and audit regimes seem keen to draw from the cross-disciplinary pool of experts who have been scrutinising the fairness, accountability, and transparency of socio-technical systems for years. Many future DSA researchers and auditors may have cut their teeth investigating the same problems, such as dis- and misinformation, or algorithmic amplification and bias.

Where you participate might ultimately depend on your business model or entity structure:
  • Researcher access (Article 40) is for research organisations and, in some cases, nonprofits or public interest researchers that meet certain criteria, including independence from commercial interests.
  • Independent auditors (Article 37) will enter into formal contracts with the companies they audit, which pay for these audits. Auditors are also subject to rules on conflicts of interest and professional ethics. The audits they will conduct are - or at least they should be - so expansive that they’ll require enormous resources, skill sets, and time [1]. The organisations that end up performing these audits may well specialise and commit themselves exclusively to providing this service.

The draft delegated act reveals an auditing regime inspired by the financial sector but with significant twists. The audits look to be a mix of financial-sector-style “model validation” (checking that the system functions as intended, e.g. the platform has released transparency reports) and normative assessment (judging the quality of what it does, e.g. whether its content moderation practices respect fundamental rights). These audits ultimately evaluate socio-technical systems, which requires a range of skills and perspectives, as well as appropriate oversight and safeguards.

Are you comfortable coming up with benchmarks? This might also influence whether you want to enter the DSA auditing game. Auditors are asked to provide (and describe in their audit report) their own audit criteria for assessing compliance [2]. In the brave new world of the DSA, many of these criteria simply don’t exist yet; there may be no scientific or industry consensus, and there may even be academic debate about the usefulness of narrow criteria for evaluation.
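As a toy illustration of what “coming up with benchmarks” can mean in practice, the Python sketch below encodes a single auditor-defined criterion with a quantitative materiality threshold and maps a measurement onto the audit opinions foreseen in the DSA (positive, positive with comments, negative). The criterion, target, and threshold values are invented for illustration; defining and defending real criteria is exactly the open problem described above.

```python
from dataclasses import dataclass

@dataclass
class AuditCriterion:
    """An illustrative, auditor-defined criterion with a materiality threshold."""
    name: str
    threshold: float           # tolerated deviation, expressed quantitatively
    higher_is_better: bool = True

def assess(criterion: AuditCriterion, measured: float, target: float) -> str:
    """Return a simple verdict for one criterion (toy logic, not the DSA's)."""
    deviation = (target - measured) if criterion.higher_is_better else (measured - target)
    if deviation <= 0:
        return "positive"
    if deviation <= criterion.threshold:
        return "positive with comments"   # shortfall within the materiality threshold
    return "negative"

# Hypothetical example: share of appealed moderation decisions resolved within 30 days.
criterion = AuditCriterion("appeals_resolved_within_30_days", threshold=0.05)
print(assess(criterion, measured=0.91, target=0.95))  # -> "positive with comments"
```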

Other things that may discourage a potential auditor: auditors are required to commit to a “reasonable level of assurance”, which is, in fact, a very high bar of certainty, especially when you consider the nuances and challenges of rigorously assessing these platforms.

So who will these auditors be, you might ask. Since the audits are somewhat modelled on the financial sector, the “big four” firms that dominate financial auditing seem likely players, though they’d also be prevented from providing other services to the companies they’d audit. The draft delegated act specifies that a platform can hire multiple auditors: either it can contract with multiple auditing organisations, or the auditors can themselves subcontract to supplement their expertise. Theoretically this means that a smaller organisation with a particular offering could either be contracted by a larger auditor or join an auditing consortium.

Might we see the social and the technical questions separated out, then, to different actors? How might this affect the audit quality or the health of the auditing ecosystem? Whether this is even a viable option for smaller actors is unclear, since subcontracting would mean subjecting themselves to the auditor’s terms and associated restrictions, and joining a consortium would be a complex partnership with its own liabilities and professional limitations.

Auditing experts not wanting to accept these contractual agreements might be more inclined to become vetted researchers under the DSA’s Article 40, or to request access to publicly available data from platforms as a public interest researcher (Article 40.12). Or they may not request any data formally and instead contribute to accountability ‘from the outside’. The DSA will create plenty of new documentation for the wider public to review, for example in the form of transparency reports. Platforms will also be required to have public ad libraries, and the European Commission will maintain a public database documenting platforms’ decisions to remove or otherwise action content (Article 24.5).
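For that ‘outside’ route, a hedged sketch: the snippet below tallies the stated grounds across a batch of publicly documented moderation decisions. The URL and JSON field names are placeholders, as the Commission’s actual public database will expose its own interface and schema, but the workflow (fetch public records, aggregate, compare across platforms) is the kind of scrutiny this public documentation enables.

```python
import json
from collections import Counter
from urllib.request import urlopen

# Hypothetical, illustrative endpoint: the Commission's public database of content
# moderation decisions (Article 24.5) will have its own real URL and schema.
DECISIONS_URL = "https://transparency.example.eu/decisions?platform=ExampleApp&limit=500"

def tally_decision_grounds(url: str) -> Counter:
    """Count the stated grounds across recent publicly documented decisions."""
    with urlopen(url, timeout=30) as resp:
        decisions = json.load(resp)   # assumed: a JSON list of decision records
    return Counter(d.get("ground", "unspecified") for d in decisions)

if __name__ == "__main__":
    for ground, count in tally_decision_grounds(DECISIONS_URL).most_common(10):
        print(f"{ground}: {count}")
```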

Critically, though, how much of the audit reports will be public is still opaque. Three months after they receive them, platforms must make the reports public, but they have ample permission to redact them where they consider that publishing them “might result in the disclosure of confidential information of that provider or of the recipients of the service, cause significant vulnerabilities for the security of its service, undermine public security or harm recipients” (Article 42.5).

The structure of the DSA relies on the research community and the public to ensure accountability, yet in practice, it seems to place far more trust in these contracted auditors than in other stakeholders. (Granted, these auditors will likely sign on to more restrictive agreements in exchange.) Interestingly, auditors are encouraged to analyse materials related to the audits of other platforms, which hints at a sectoral approach, and suggests that information in these public audit reports will be of value. But at the end of the day, researchers will have much less access to information than auditors [3].

Like the GDPR before it, the DSA will have market-altering effects. The GDPR helped spur networks of privacy professionals, compliance tools, and privacy tech. The DSA’s audit requirements will create a market for platform auditors, PhD opportunities for researchers, and much more. It is important to set the best possible conditions for this new ecosystem. At least in the short term, these audits seem unlikely to be a silver bullet for accountability. Accountability experts can at least remain vigilant that the audits don’t become a rubber stamp or a tool for ethics washing.

You might not be sure what role you’ll play in the new world of the Digital Services Act. At the moment your insight is likely relevant to the elaboration of both the auditing and the data access systems. Note they are moving on slightly different timelines: we already have a draft of the further specifications on independent audits, along with draft templates for the auditors’ report and the platforms’ audit implementation report. Take a look and share your thoughts with the European Commission here before June 2nd. Meanwhile the Commission is holding an initial call for feedback in relation to the delegated act on data access until May 31, but will be consulting further over the coming months.

FOOTNOTES

[1] The draft delegated act reveals a focus on algorithmic auditing, which is critical, but less attention is paid to other complex elements, such as assessing user-friendliness, user experience, or user privacy.

[2] “The audit criteria used for assessing compliance with each audited obligation or commitment, and the materiality threshold tolerated and expressed in qualitative or quantitative terms, as appropriate.” Article 10.2 of the draft regulation.

[3] To concretise the sectoral approach and to guard against industry capture, vetted researchers should be able to scrutinise the audit reports in full. Arguably this already falls within the realm of a research request they could make under Article 40, since it relates to compliance with the independent auditing obligation itself.

