The second cohort of the Mozilla Technology Fund will fuel projects building much-needed auditing, evaluation, and accountability mechanisms for AI
(SAN FRANCISCO, CA | TUESDAY, SEPTEMBER 6, 2022) -- Mozilla is seeking people and projects who are using open-source tools to audit AI systems — and awarding up to $50,000 each to those doing so.
The next cohort of the Mozilla Technology Fund will support projects that provide tools and resources to auditors working to create greater accountability for AI systems. Winning projects will help fuel the growing AI transparency space and contribute to the learnings and community of the Open Source Audit Tooling (OAT) Initiative, a project developed and led by Mozilla Fellow Deb Raji.
Applications will be open from 6 September to 5 October. Learn more about the criteria, and then apply, here.
Says Mehan Jayasuriya, Senior Program Officer at Mozilla: “We’re excited to launch the second call for proposals for the Mozilla Technology Fund. We’ve worked closely with researchers and Mozilla Fellows like Deb Raji and Abeba Birhane to identify auditing tools as a fast-evolving, high-impact segment of the ecosystem that is currently underfunded. Our goal is to double down on our investments in tools which can help identify bias in and increase the transparency of AI systems.”
Many industries — ranging from food and agriculture to the automotive industry — are tightly regulated and subject to inspection and external auditing as a means to ensure safety and accountability. Yet while AI systems are increasingly being used to make determinations that impact people’s employment, health, finances and legal status, there are few mechanisms in place to ensure that these systems are being held accountable for harms, bias and discrimination.
Mozilla defines an “audit tool” as any resource that supports algorithmic analysis and inspection (including benchmarks/datasets, analysis tools, crowdsourcing tools, documentation templates, frameworks, etc.). These tools may support the assessment of expectations institutions have of themselves (e.g. internal auditing) as well as expectations others have of them (e.g. external auditing), at various stages of design and development. Learn more about AI audits here.
Ultimately, projects should help AI systems better serve the interests of people, especially those disproportionately and negatively impacted by algorithmic systems. They should also imagine new ways of building and training trustworthy AI systems in the future. For example, a winning project might be a crowdsourcing tool that collects data about an online platform to allow external inspection of a pricing or recommendation model, or an observatory tool that allows journalists to report on what content is promoted or suppressed on a social media platform.
The Mozilla Technology Fund launched in 2022 with the aim of holding AI designers accountable, reducing bias and increasing transparency. The first cohort of winners included projects in South Africa, Japan, the Netherlands, and beyond.
Press contact: Kevin Zawacki | [email protected]