By Jon Lloyd | Feb. 5, 2020 | Advocacy
On February 4, Mozilla gave evidence to the UK’s House of Lords Select Committee on Democracy and Digital Technology about combating online disinformation.
Online disinformation is a complex issue. And yet yesterday, when Lord David Puttnam asked Mozilla what one thing the UK government should do to improve the regulation of political campaigns in a digital age, our answer was simple: transparency.
For more than a year, Mozilla has been researching and speaking out about the problematic relationship between digital advertising and online disinformation. In 2018, Mozilla participated in the high-level expert group on disinformation in Brussels. That same year, we signed the EU Code of Practice on Disinformation. And in 2019, we launched public advocacy campaigns urging online platforms to prioritize political advertising transparency.
There’s no silver bullet for solving online disinformation, but transparency is a necessary and powerful first dose of medicine. It’s a means to an end, helping us identify other key areas where changes to the online advertising model are needed. Mozilla believes that Facebook, Google and other platforms should be required to maintain thorough public ad archives, allowing users, researchers, and journalists alike to scrutinize ad content, targeting parameters (i.e. whom advertisers intended to see an ad), and engagement metrics (i.e. who actually saw and clicked on the ad). Further, these libraries shouldn’t be limited to political ads; they should encompass all ads.
Disinformation spreads like wildfire with help from the digital ad ecosystem, manipulating and misleading millions of users. Platforms have a responsibility to step up and take action. Last year, Mozilla and scores of researchers outlined exactly what an effective ad transparency archive should look like; read those guidelines here.
We stressed these points to the Select Committee. At a fraught time online and off, we urge UK lawmakers to listen: to mandate that platforms take responsibility, and to hold them accountable if they fail.