Conversations about AI are no longer just about technology. They’re also about society.

Today, the AI systems in our everyday lives can perpetuate and amplify biases that have long existed offline: Recommendation algorithms promote racist messages. Facial recognition systems misidentify Black faces. And voice assistants like Alexa and Siri struggle to understand Black voices. As the AI in consumer technology grows more sophisticated and prevalent, problems like these will grow even more complex.

In August, Mozilla announced $260,000 in funding for Black artists who use art to spotlight how AI can reinforce — or disrupt — systems of oppression.

Today, Mozilla is announcing the eight winning projects.

The winners include an app that uses AI to predict police brutality; a stark visualization of the ways voice technology excludes Black voices; an animated film about Afrofuturism; and more. Awardees hail from the U.S., the UK, and the Netherlands.

These projects will launch to the world beginning in spring 2021. Winners were selected by a panel of 10 judges composed of artists, technologists, and activists. Learn more about each awardee in the list below.

Says J. Bob Alotta, Mozilla VP of Global Programs: “Black artistry exists within a long-standing tradition of applying rigorous artistic analysis to the most compelling issues in society. With these awards, we’re eager to continue that tradition — and also to fuel the long-overdue interrogations of race and racism happening now in the U.S. and around the globe.”

The winners:

Melalogic

Melalogic | by Melalogic in the U.S. | @getmelalogic

Melalogic is an app that gives Black people a single source of skin health information from trusted professionals who look like them. Participating users can help build the Black Skin Health AI Data Set, which will let them submit a photo of a skin issue and receive instant feedback on what it might be and suggestions for how to treat it. This new public dataset will fuel AI-powered dermatology research and prediagnoses.

Future Wake

Future Wake | by Future Wake in The Netherlands | @futurewake

This interactive web app turns the discriminatory practice of predictive policing upside down, instead allowing citizens to predict police brutality. The project uses AI and multiple datasets to determine where, when, and how police brutality in the U.S. is most likely to occur. It also paints a vivid picture of the potential victim, from their skin color to their age, to put a human face on the prediction.

Afro Algorithms

Afro Algorithms | by Anatola Araba in the U.S. | @anatolaaraba

This 3D animated short film in the Afrofuturist genre explores the topics of AI and bias. In a distant future, an artificial intelligence named Aero is inaugurated as the world’s first AI ruler. But Aero soon learns that important worldviews are missing from her databank, including the experiences of the historically marginalized and oppressed. A slate of well-known Black artists lend their voices to the film, including Robin Quivers, Hoji Fortuna, and Ava Raiin.

Hope

Hope | by Tracey Bowen in the UK | @hopeimmersive, @controlrapp, and @onallee

This collection of immersive journalism and Afrofuturism artwork imagines two possible futures: one where bias and discrimination have been banished from technology, and one where they run rampant. Through the experience, viewers learn how AI and datasets are vulnerable to bias, but also learn about the real-world, Black-led initiatives underway to counteract this bias.

Dark Matters

Dark Matters | by Johann Diedrick in the U.S. | @johanndiedrick

This interactive web experience spotlights the absence of Black speech in the datasets that train voice assistants like Siri, Google Home, and Alexa. The project also reveals the exclusion and code-switching that result. Through a three-dimensional visualization of major speech datasets, viewers come into contact with vacuums of space representing these data voids. Intertwined are narratives attesting to the resilient and resistive qualities of Black speech, suggesting how we might create more equitable futures.

Binary Calculations Are Inadequate to Assess Us

Binary Calculations Are Inadequate to Assess Us | by Stephanie Dinkins in the U.S. | @StephDink

This project seeks to supplant the exclusionary algorithms and datasets that currently shape our everyday lives with more open, interactive, and compassionate alternatives. It proposes a data commons approach in which anyone can contribute to a training dataset, and anyone can then use that dataset to power AI. The project models this approach by creating an open, BIPOC-focused dataset that is then used, with AI, to generate artwork.

Points of View

Points of View | by Alton Glass in the U.S. | @grximmersivelab

In this five-part mixed-reality series, the protagonist is Cassius Moore, a Black man living in a hyperrealistic future. Cassius is on parole after serving two years in prison for biometric hacking. But as he seeks to start a new life, he is mercilessly surveilled by a weaponized, AI-powered parole drone. Along the way, viewers learn about the very real threats of predictive policing, surveillance, and AI systems that replicate human biases.

Black Arts + Culture: Generative Traditions with AI and Design in Carnival

Black Arts + Culture: Generative Traditions with AI and Design in Carnival | by Vernelle Noel at University of Florida, in the U.S. | @VernelleNoel

This project uses AI to remix Black histories of design, art, and dance. Archival images of dancing sculptures from the Trinidad Carnival are combined with one another, and with images of Black dancers across the diaspora (Caribbean, U.S., and beyond), in a single dataset. A machine learning model then generates new designs based on this data. The resulting art will be presented via an online gallery and discussions.

Mozilla’s Creative Media Awards support people and projects on the front lines of the internet health movement — from creative technologists in Japan, to tech policy analysts in Uganda, to privacy activists in the U.S. Creative Media Awards are supported by the NetGain Partnership, a collaboration between Mozilla, the Ford Foundation, the Knight Foundation, the MacArthur Foundation, and the Open Society Foundations. The goal of this philanthropic collaboration is to advance the public interest in the digital age.

Past Creative Media Awards have interrogated algorithmic bias in dating apps and human resources departments; they have spotlighted the dangers of deepfakes and emotion recognition technology; and they have raised awareness about filter bubbles and bots.