Kwanele App
The Kwanele App was built in an open-source AI collaborative.

Building open source AI collaboratives with diverse global communities can diversify the spectrum of people shaping ethical AI.

In a landscape where Big Tech's algorithmic dominance grows ever more opaque, open AI collaboratives offer much-needed transparency. The model is a viable way to democratize power, diversify the people shaping AI, and embed ethical considerations into AI product development. Among Big Tech companies, Apple leads with over 29 AI startup acquisitions since 2010, followed by Google with 15, Microsoft with 13, Meta with 12, and Amazon with 7. With so much access to AI talent and funding now controlled by Big Tech, open AI collaboratives present a much-needed alternative where ethical AI builders can thrive.

An ‘open AI collaborative’ is a collective of individuals who bring both their expertise and experience living and working with algorithms together in service of a shared goal.

The MozFest Trustworthy AI working group (TAIWG) is an example of such a collective. The working group includes AI builders, developers, artists, civil society leaders, policy-makers, and funders who work in self-organizing projects that push towards a more equitable automated future for all. The group’s focus is on:

  • establishing best practices in key areas of Trustworthy AI.
  • involving more diverse stakeholders in building AI.
  • developing new technologies as building blocks for developers.

Since the MozFest TAI working group was launched in 2020, the community has stewarded 24 projects from feminist AI to Black data futures, internet subcultures, and youth safety online. With members representing 56 countries and project leads from diverse academic backgrounds, the TAIWG is an embodiment of open source principles in practice.

Case Study: Kwanele App

Imagine an AI chatbot that is part paralegal, simplifying the legalese within government statutes; part crisis response, sending rescue in moments of imminent danger; part social worker, supporting you in reporting gender-based violence; and all with the gentle approach of a trauma-informed talk therapist. Kwanele, a South African non-profit, joined the MozFest Trustworthy AI working group to access the ethical AI talent needed to achieve their vision of a more equitable automated future.

GOAL: Build an AI chatbot that helps survivors of gender-based violence, including women and children, report and successfully prosecute crimes perpetrated against them.

TOOLS: Using Large Language Models (LLMs), the chatbot will respond to questions related to local legislation, including the Protection from Harassment Act and the Criminal Law (Sexual Offences and Related Matters) Amendment Act.
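One common pattern for grounding an LLM chatbot in specific legislation is to retrieve relevant statute passages and fold them into the prompt alongside a trauma-informed instruction. The sketch below is purely illustrative: the statute excerpts, function names, and prompt wording are assumptions for demonstration, not the Kwanele app's actual implementation.

```python
# Illustrative sketch of retrieval-grounded prompting for an LLM-backed
# legal chatbot. The statute excerpts below are placeholder text, NOT
# quotes from the acts, and the design is hypothetical.

# A tiny in-memory "statute store"; a real system would index the full
# text of each act and retrieve passages by semantic similarity.
STATUTES = {
    "Protection from Harassment Act": [
        "A complainant may apply to the court for a protection order.",
    ],
    "Criminal Law (Sexual Offences and Related Matters) Amendment Act": [
        "The Act consolidates offences relating to sexual violence.",
    ],
}

def retrieve(question: str) -> list[str]:
    """Naive keyword retrieval: return passages from any act whose
    name shares a word with the question."""
    hits = []
    q_words = set(question.lower().split())
    for act, passages in STATUTES.items():
        if q_words & set(act.lower().split()):
            for passage in passages:
                hits.append(f"{act}: {passage}")
    return hits

def build_prompt(question: str) -> str:
    """Combine a trauma-informed instruction with retrieved statute
    passages so the model answers from cited sources rather than
    from memory alone."""
    context = "\n".join(retrieve(question)) or "No passage found."
    return (
        "You are a supportive, trauma-informed assistant. Explain the "
        "law in plain language and cite the act you rely on.\n\n"
        f"Relevant passages:\n{context}\n\n"
        f"Question: {question}"
    )

prompt = build_prompt(
    "How do I get a protection order under the Protection from Harassment Act?"
)
print(prompt)
```

The assembled prompt would then be sent to the model (GPT-3 at the time of the project); grounding answers in cited passages is one way to make the chatbot's legal guidance more transparent and auditable.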

METHODOLOGY: Informed by Critical Feminist Interventions and user research conducted with the target demographic for the app, the working group held a series of virtual workshops with global and local stakeholders with backgrounds in data privacy, policy, digital rights, law, software development, product design, AI/ML, and women’s advocacy, to name a few. The idea was to bring divergent expertise together to help the Kwanele app development team embed algorithmic transparency and other ethical considerations into their practice.

RESULT: The workshops and mentoring within the working group allowed the Kwanele team to create an ethical AI strategy that helps close the gap between reducing algorithmic harms in theory and reducing them in practice. They also received tactical support in building the AI chatbot on GPT-3.

The open AI collaborative is an example of what the future of technology development might look like. For highly engaged users who are dissatisfied with the status quo, open AI collaboratives present a way to highlight issues in the products they use daily, the companies that own their data, and the industries that profit from their algorithmic dependence, while productively working towards an alternative. The model also creates resourced spaces where these alternative products, like the Kwanele app, can deliver real-world impact while reducing algorithmic harms for already marginalized communities.

Interested in joining the MozFest TAI working group? Register here.