Making Trustworthy AI (TAI) real can be a daunting task, but the Civil Society Actors Working Group (CSAWG), convened by MozFest, is doing exactly that.

Since October 2021, eight groups of civil society leaders have been convening contributors from the broader MozFest TAI Working Groups community to help advance projects addressing local and global challenges in TAI.

As we begin the new year, here’s a quick update on each CSAWG project’s progress and a reminder of how to get involved!

The projects

A feminist dictionary in AI

A feminist dictionary in AI has been convening contributors to identify examples of bias in AI, especially gender bias, as well as ways to mitigate it. Twenty contributors have joined so far. Their current focus is on drafting the main concepts and terms the dictionary should include. You can sign up to join the project and also learn more about the dictionary and how to use it at MozFest.

Accountability Case Labs

Accountability Case Labs has been convening contributors to plan and deliver its first TAI accountability case lab workshop in January examining Twitter’s Bias Bounty Challenge. A debrief on lessons learned so far will follow at MozFest, as will another accountability case lab workshop. Pre-planning for the project’s second and third workshops is underway, as well. Over 20 contributors have joined so far from disciplines like data science, human rights law, AI auditing, social science, humanities, computer science, and civil society advocacy. If you’d like to get involved, you can join the mailing list or sign up for the next project meeting.

AI Governance in Africa

Since its start, AI Governance in Africa has held several meetings to discuss its strategy and how to proceed with its research on the state of the art of existing AI governance structures in Africa. So far, 8 contributors have helped analyze the existing AI-related policies and strategies identified in the research. A draft of the analysis will be shared at MozFest. Learn more about the project here.

Audit of Delivery Platforms

Audit of Delivery Platforms is organizing a workshop to define the guiding principles and main features for its app that will ultimately audit the algorithms that delivery apps use to manage their workers. Check out project documentation to learn more.

Black Communities - Data Cooperatives

Since October, Black Communities - Data Cooperatives has held its first community event, started its newsletter, developed its governance and accountability models, and set up a community interest company (a kind of social benefit organization) to support its work. So far, 15 contributors have helped develop the project in cooperation with local communities. You can contact the project leads at [email protected] to learn more. You can also join Black Communities - Data Cooperatives at MozFest to play a new game designed to highlight the issues of privacy and power raised by the use of AI in access to health services.

Harnessing the Civic Voice in AI Impact Assessment

Harnessing the Civic Voice in AI Impact Assessment has been engaging stakeholders such as civil society organizations and affected communities involved in Human Rights Impact Assessments (HRIAs) of AI systems. Currently, the project is using a design-thinking process to gather broad ideals, existing learnings, and lived experience, and to focus them on the specific needs and context of co-creation with contributors and stakeholders. Outputs will include new research methodologies that acknowledge and include under-represented people and groups usually absent from these discussions, as well as a MozFest session focused on developing guidance for including the public, especially civil society and impacted communities, in HRIAs for AI. Over 130 people have participated so far. If you'd like to contribute to this project, register here.

Spelman Blackpaper

The Spelman Blackpaper team is interviewing civil society actors about the impact of algorithmic bias on Black women. They're also hosting a MozFest session called An Intersectional Discussion on AI-Mediated Microaggressions, which you can attend during the festival this March. Follow the MozFest community Slack for more updates on the Spelman Blackpaper as MozFest approaches.

Trustworthy AI Education Toolkit

Most recently, the Trustworthy AI Education Toolkit has been developing user personas, prepping outreach comms, and curating its resources with metadata to make it especially helpful for teachers. So far, 8 contributors have helped build the toolkit. You can check out the project and share feedback and questions at MozFest during a session focused on the importance of teaching students the ethical principles of AI. You can also join the project here beforehand.

Stay connected

There’s still plenty of time left to join a project and help make trustworthy AI real for you and your communities.

Join us for our last regular meeting ahead of MozFest by registering for our January CSAWG call.

You can also register for MozFest 2022 right now and check out all the working group events and sessions in this year's program. All of these projects will share their work at MozFest, so join them there to learn more and find your own pathway to contributing.

To keep up with other events, projects, and programs like these, remember to subscribe to the MozFest newsletter and join our MozFest community Slack. We hope you’ll join us and these amazing projects at MozFest 2022!

A selfie of Chad Sansing, Program Manager, MozFest team, Mozilla Foundation

Chad Sansing works on leadership and training, as well as facilitator and newcomer support, for MozFest. When he’s not at work, you can find him gaming, reading, or enjoying time with his family. Prior to joining Mozilla, he taught middle school for 14 years.

MozFest is part art, tech and society convening, part maker festival, and the premiere gathering for activists in diverse global movements fighting for a more humane digital world. To learn more, visit www.mozillafestival.org.

Sign up for the MozFest newsletter here to stay up to date on the latest festival and internet health movement news.

