Mozilla, Edelson stress that Facebook’s self-imposed transparency measures are failing

Both urge lawmakers to mandate platform transparency and independent researcher protections

(New York, NY | September 27, 2021) -- In Congressional testimony on Tuesday, September 28, New York University researcher Laura Edelson will urge Congress to adopt policies that would compel internet platforms to disclose data on advertisements and provide legal protections to researchers and journalists investigating them. The policies were co-developed with Mozilla and the Knight First Amendment Institute at Columbia University.

In August, Facebook suspended the accounts of Edelson and her colleagues and shut down their NYU Ad Observatory investigation into online advertising and targeting, making further research much more difficult and sending a chilling message to other researchers. NYU Ad Observatory uses donated, anonymous data from consenting users to help reveal trends in online advertising and targeting on Facebook.

Edelson is co-director of NYU’s Cybersecurity for Democracy project and will appear before the Subcommittee on Investigations and Oversight Tuesday, September 28 at 10am ET. The hearing is titled “The Disinformation Black Box: Researching Social Media Data.”

Says Edelson: “Every day that my team cannot access the Facebook data we need to do our research, we fall further behind in the race to find answers. Online misinformation has created an enormous public health crisis, and we need rigorous science based on data on how to deal with it — just like we did with Big Tobacco, industrial pollution, and drunk driving.”

Edelson continues: “Facebook’s behavior is anti-science and anti-progress. It’s clear Facebook can’t be trusted to provide the transparency we need to move forward — and so we need lawmakers to compel them to disclose the data.”


Indeed, Edelson’s testimony follows recent revelations that Facebook has extensively studied its own platform, yet failed to act even after learning it was harmful to users, including teens.

In her testimony, Edelson will cite three specific policies lawmakers should pursue to make platform transparency a reality. The policies were developed in collaboration with Mozilla and, in the case of the second policy, also in collaboration with the Knight First Amendment Institute at Columbia University.

  1. Require platforms to provide universal advertising transparency. Data should include who is paying, how much, targeting, impressions, reach, and other specifications, and it must be made publicly available without restrictions.
  2. Establish a “safe harbor,” or legal protections, for researchers and journalists who investigate the operations of platforms, so long as they handle data responsibly and adhere to professional and ethical standards. Safe harbor would not grant access to data, but rather clarify the legality of ongoing investigations that currently exist in limbo.
  3. Require platforms to make public data public. All public content with meaningful reach should be made available via public tools or searchable interfaces. The biggest social media platforms already make public data available to their biggest advertisers, so it’s time they gave the public access to the same data.

Says Marshall Erwin, Chief Security Officer at Mozilla: "Transparency is the first, inescapable step toward holding social media platforms accountable for harmful outcomes. Without insights into what people experience, what ads are presented to them and why, and what content is recommended to them and why, we cannot begin to understand how misinformation spreads.”

Erwin continues: "Mozilla built Rally, the data donation platform, because we believe that users should determine who benefits from their data — and because understanding what is happening on the internet is the first step towards fixing it. Businesses built on people's data should not profit off your data and then, when you decide to give it away, say that 'it's against the rules.' Platforms should not be scared of showing us what that data is used for."