Dark Matters screenshot

Voice interface systems have become an integral part of our everyday lives. Whether it's Alexa, Google Home, or Siri in your personal life, or the voice recognition software used to verify your identity when you call your bank: your voice holds power.

But what if the AI behind this technology was unable to recognise your voice? What if the datasets used to train the estimated 4.2 billion digital voice assistants worldwide consisted mostly of American English samples and excluded voices that sound like yours? This is the reality for many Black people globally who, due to the absence of representative voices in training datasets, have to ‘code-switch’ in order to be understood by technology meant to transcend racial bias.

Johann Diedrick aims to change that. Today he launches his new project “Dark Matters” — an interactive web experience highlighting the absence of Black speech in training datasets.

> https://darkmatters.ml <

The Dark Matters interactive web experience consists of a 3D environment filled with static 3D model spectrograms that play spatialised audio recordings of speech. As users navigate the space, they'll hear datasets like the ones currently used to train voice interface systems, but will also encounter the absence of Black speech: a dynamic black void representing the Black voices often excluded from these datasets. The project is being showcased for the first time at Ars Electronica's A Digital New Deal exhibition as part of Ars Electronica Gardens from September 8th to 12th, as well as in the in-person BIAS exhibition at the Science Gallery in Dublin starting on October 14th.

Johann Diedrick is a New York City-based artist with a focus on sonic encounter, and the recipient of a 2021 Mozilla Creative Media Award.

Inspired by the experience of his own Jamaican-born parents using Alexa, Diedrick explains: “Here we have a widespread and growing technology with an inbuilt, pernicious racial bias. It changes how we speak, flattening speech and cultural diversity and forcing Black speech into normative, standardized forms, as code switching for ‘white ears’ now becomes code switching for ‘AI ears’.”

"We have a widespread and growing technology with an inbuilt, pernicious racial bias. It changes how we speak, flattening speech and cultural diversity."

Johann Diedrick, Creative Media Awardee

Diedrick's ambition is to engage machine learning and AI engineers to ensure that the full, varied spectrum of Black speech is better served by AI-powered voice interface technologies. You can find out more at the Dark Matters website.

Mozilla’s Creative Media Awards are part of our mission to realize more trustworthy AI. The awards fuel the people and projects on the front lines of the internet health movement — from activists to documentary filmmakers to researchers.

The latest cohort of Awardees are all Black artists who spotlight how AI can reinforce — or disrupt — systems of oppression. The AI systems in our everyday lives can perpetuate and amplify biases that have long existed offline: Recommendation algorithms promote racist messages. Facial recognition systems misidentify Black faces. And voice assistants like Alexa and Siri struggle to understand Black voices. As the AI in consumer technology grows more sophisticated and prevalent, problems like these will grow even more complex.

Learn more about upcoming Creative Media Award projects.