Recent news and Congressional testimony highlight the real-world harms of digital content – particularly for children and teens, non-white communities, women, and religious minorities. Greater transparency from social media companies is a critical step toward understanding and effectively solving these problems.
Mozilla supports universal advertising transparency along with new, meaningful tools to enable transparency from all major social media companies.
- Social media platforms like Facebook play a key role in disinformation, discrimination, and online harms, but we lack the tools to scrutinize – and thus effectively regulate – them.
- Although there is bipartisan agreement on the need for advertising transparency, the existing ad libraries are subject to the whims of the platforms and fail to provide the information necessary for meaningful research or scrutiny.
- Companies like Facebook have conducted extensive internal research into the harm caused by their platforms, but refuse to give researchers, journalists, and civil society organizations access to their platforms.
Mozilla supports Executive or Congressional action to require social media platforms to provide universal transparency:
1. Companies should provide publicly available data that includes information about all ads on the platform, who is paying for ads, how much is spent, how they are targeted, how algorithms optimize for the “best ads,” and other specifications.
Mandated universal advertising transparency will ensure a better understanding of which paid content is currently or potentially promoting disinformation, is discriminatory, or leads to other harmful outcomes.
2. Establish a “safe harbor” for researchers and journalists. This would allow them to investigate the operations of social media platforms in the public interest, as long as they handle data responsibly and adhere to strict ethical standards. A safe harbor would not grant access to data, but would clarify the legality of ongoing research into companies like Facebook.
Legislating a safe harbor would shift the power to decide what is in the public interest from the technology companies to trusted independent experts.
3. Require social media platforms to make high engagement data transparent. Social media posts that are publicly visible and have meaningful, organic reach should be made available for scrutiny via public tools developed with input from experts in a manner that protects individual privacy.