The MozFest Trustworthy AI working group invites expert AI Builders to work alongside highly engaged Civil Society Actors in envisioning a more equitable automated future. Since 2020, we have incubated bold ideas from the MozFest community within our volunteer network of working group projects and sought paths for them across other programming at the Mozilla Foundation.
MozFest TAI Working Group Alums receive Mozilla Tech Fund Awards
Mozilla launched the Mozilla Technology Fund (MTF) last year in an effort to contribute to a healthier internet and more Trustworthy AI. We are so excited to announce that two of the five MTF projects for 2022 emerged from the MozFest Building Trustworthy AI working group.
We are incredibly proud that the MTF's Bias and Transparency in AI Awards will provide each of these projects with up to $50,000 USD and nurture them with expertise over the next 12 months.
Algowritten | UK (MozFest TAI Working Group Cohort 1)
Algowritten is a project aimed at addressing AI bias in creative content. It comes from Naromass, an organization that seeks to change the current ways in which “inequality is systematized through algorithmic processes” in cultural and political spaces. With Algowritten, Naromass hopes to identify bias in written texts, beginning with sexism. Users will be able to sign up to Algowritten, run text through the program, and receive an analysis of how the text could be harmful or sexist.
Fair EVA | South Africa (MozFest TAI Working Group Cohort 2)
Fair EVA’s mission is to ensure that voice technologies work reliably for all users. As part of that mission, the organization is building a tool called SVEva Fair, intended to be “an audit tool, dataset and knowledge base to evaluate bias in voice biometrics”. SVEva Fair will be an open source library and dataset that developers can use to test for bias in their speaker verification models.
MozFest TAI Working Group Projects show up in record numbers at MozFest 2022
The Mozilla Festival is where most of these projects open to the public for the first time. We are so excited for our alumni projects, which have now received the funding they need to reach new heights, and we look forward to the feats our other working group projects will achieve once our current cycle concludes in April. For now, you can see their progress at MozFest 2022. A dozen working group projects will be showcased, and you don’t want to miss any of them.
You can meet the leads of most of the AI Builder & Civil Society Actor Working Group Projects at this part-demo, part-networking event at MozFest 2022.
AI Builder Sessions at MozFest 2022:
We think of AI Builders as people who create Trustworthy AI and machine learning (ML) products, like data scientists, developers, engineers, and product managers.
- Visualizing Internet Subcultures on Social Media available March 7th - 11th
- Developing a Trustworthy AI Scorecard for Mozilla Data-Powered Products on March 7th
- Low-Resource Languages, and their Open-Source AI/ML Solutions through a Radical Empathy Lens on March 9th
- Fair Voice: What Happens When Your Voice Signal Becomes Your ID? on March 10th
- Youth for a Safer Internet: How Youth can Shape Artificial Intelligence for Online Safety on March 10th
- Truth of Waste as a Public Good at the MozFest Science Fair on March 11th
- Algowritten: Genre Bias in AI-Generated Text on March 11th
Civil Society Actor Sessions at MozFest 2022:
Launched in 2021, the Civil Society Actors stream is made up of people outside government and industry who work in their local and global communities through art, journalism, policy-making, research, scholarship, activism, and technical literacy efforts, and who care about the impact of artificial intelligence.
- AI Governance in Africa on March 7th
- A Feminist Dictionary in AI on March 7th
- Accountability CaseLabs: ShotSpotter on March 8th
- Trustworthy AI Education Toolkit on March 8th
- QUAIK: An AI app for healthy people on March 9th
- Spelman College Presents: An Intersectional Discussion on AI Mediated Microaggressions on March 10th
- Insights from the Accountability Case Labs Project on March 11th
Want to interact with these projects and our incredible community?