“Future Wake,” a Mozilla Creative Media Award, turns predictive policing on its head
Website uses AI trained on real law enforcement data to predict police killings, tell victims’ stories
(October 14, 2021) -- Today, a new project is turning the idea of AI-powered predictive policing on its head — and serving as a stark reminder of the fatal police encounters that occur in the U.S., and their relationship with race.
“Future Wake” is a Mozilla Creative Media Award and interactive artwork that uses AI trained on real law enforcement data to predict future police killings. On the Future Wake website, viewers can visit five major U.S. cities, from Los Angeles to New York. Then, an AI system trained on the Fatal Encounters and Mapping Police Violence datasets predicts who in those cities is most likely to be killed by police; where they could be killed; and how they could be killed. These predictions are presented in startling detail: In each case, viewers see and hear from the fictional victim discussing their experience and eventual death.
Future Wake was created by two artist-technologists who wish to remain anonymous at this time. In a joint statement, they say: “Future Wake turns the application of predictive policing upside down. Rather than predicting crimes committed by the public, it focuses on future fatal encounters with the police. To predict future events, Future Wake uses historical data of past victims of police violence to predict where, when, who and how the next victim will die.”
The creators continue: “Rather than communicating the traumas of police brutality solely through data and statistics, we intend to connect viewers to depictions of these predicted future civilian-police encounters through human-driven storytelling.”
For those moved by Future Wake who would like to contribute to this art project, the creators welcome voice donations to help tell the stories of future victims. Contact [email protected]
The Creative Media Awards are part of Mozilla’s mission to realize more trustworthy AI. The awards fuel the people and projects on the front lines of the internet health movement, from activists to documentary filmmakers to researchers.
The latest cohort of Awardees are all Black artists who spotlight how AI can reinforce — or disrupt — systems of oppression. The AI systems in our everyday lives can perpetuate and amplify biases that have long existed offline: Recommendation algorithms promote racist messages. Facial recognition systems misidentify Black faces. And voice assistants like Alexa and Siri struggle to understand Black voices. As the AI in consumer technology grows more sophisticated and prevalent, problems like these will grow even more complex.