Tarcizio Silva


This is a profile of Tarcizio Silva, a Mozilla Fellow in the Tech and Society Fellowship program.


Mozilla Fellow Tarcizio Silva is from the state of Bahia in Brazil, which he describes as one of the most politically left-leaning states in the country. Even so, Bahia was one of the first Brazilian states to deploy facial recognition and surveillance programs in public spaces with the stated aim of cracking down on crime, a major campaign pledge of right-wing President Jair Bolsonaro when he was elected in 2018.

Silva, an expert in algorithmic bias, has spent years studying artificial intelligence systems like the facial recognition and surveillance programs used in Bahia, researching the ways they help perpetuate structural racism. Generally, “algorithmic bias” refers to the well-documented phenomenon of artificial intelligence systems making decisions that are systematically unfair to people of color, women, people with disabilities, and other historically marginalized groups. This kind of discrimination is baked into technologies that were created overwhelmingly by white people and that rely on data sets in which white people are over-represented.

Silva strongly prefers the term “algorithmic racism” over “algorithmic bias” in relation to his work. He defines “algorithmic racism” as the way social media, apps, and AI reproduce or exacerbate racism in society, and says it more accurately encompasses the problems with the technology itself; the data we feed into the technology; and how deeply society relies on and trusts the technology. “The problems I’m denouncing are related to how structural racism is connected with the idea of using technology for mediating everything,” says Silva, whose 2022 book, Racismo Algorítmico (“Algorithmic Racism”), examines artificial intelligence and discrimination in digital networks.

One high-profile case of algorithmic racism in Brazil’s facial recognition systems grabbed headlines earlier this year. In January 2022, Black Panther actor Michael B. Jordan, who is Black, appeared on a Brazilian police most-wanted list after a facial recognition program incorrectly matched him to a murder suspect in a mass shooting; the technology is notoriously poor at distinguishing between Black faces. But that single celebrity incident doesn’t fully capture the widespread damage these technologies do to millions of people of color in Brazil, where Silva says the programs remain popular despite their flaws and failures.

“I don’t think facial recognition should be deployed in public spaces,” Silva says. “Not just because it fails a lot, but because it can actually become a tool for incarcerating people.” As of 2021, more than 750,000 people were incarcerated in Brazil, giving it the world’s third-largest prison population. Roughly 56 percent of Brazilians identify as Black or mixed-race (“Pardo”), yet that group makes up 67 percent of the prison population. Silva has seen estimates that roughly 28 percent of prisoners in Brazil are being held provisionally, still awaiting judgment and due process, so flawed facial recognition programs that lead to wrongful arrests could also leave people wrongly imprisoned with little recourse.

I don’t think facial recognition should be deployed in public spaces. Not just because it fails a lot, but because it can actually become a tool for incarcerating people.

Tarcizio Silva, Mozilla Fellow

Silva acknowledges that the deep and worsening social and economic inequality in Brazil has driven spikes in thefts and robberies, prompting some people to accept infringements on their rights in exchange for feeling safer. “There are some people who think that more state violence is the solution,” he says. But he points out that while algorithmic racism is bad for everyone, its worst and most immediate consequences fall far more heavily on people and communities of color than on white people.

Nonetheless, Silva sees optimistic signs for change in public perceptions surrounding algorithmic racism in Brazil. For instance, a bill currently under debate in the National Congress, which Silva thinks could pass, would establish a council on internet transparency and accountability. And recently, like-minded activists have successfully lobbied for restrictions on the use of facial recognition technologies in public places, creating momentum for possible further legal reforms.

Educating younger Brazilians about the realities of algorithmic racism and how it can affect their daily lives is at the heart of Silva’s Tech and Society Fellowship at Mozilla, a program, supported by the Ford Foundation, that embeds technologists within civil society organizations across the Global South. In partnership with the non-profit Ação Educativa, Silva is soliciting lesson-plan proposals from teachers across Brazil for students ages 6 to 17. The plans are meant to advance racial equality in digital technology and fill gaps in students’ knowledge about the relationship between tech and structural racism. Silva believes integrating the plans into curricula across these age ranges will ultimately “promote the ability of Brazilians to understand and act to defend their human rights.” From this process, 35 lesson plans will be selected and published online for free so teachers nationwide can access and share them.

Silva recounts his own experiences years ago as a Master’s student in Media Studies at a respected university in his home state of Bahia, which has one of the largest Afro-descendant populations in Brazil. There, he realized that works by Black scholars and intellectuals were all but absent from his coursework. Inspired by the work of W.E.B. Du Bois, he started an editorial project called “Desvelar” (“to unveil”) featuring the writings of young people of color on technology-related topics. “I think that’s been my biggest challenge — this perceived lack of reference,” Silva says, pointing to the dearth of works by people of color that he encountered in his university years. “Because it’s not that we don’t have references — it’s that they’ve been invisibilized.”

Another project of Silva’s during his fellowship involves a survey of 113 Afro-Brazilian technology specialists, who were asked about their greatest challenges as people of color working in tech in Brazil. Silva was surprised to hear that the main problem they cited was the feeling that Black people in Brazil are simply not heard when they talk about technology. “It was a small report, but it was interesting to understand how producing, editing and curating knowledge by Afro-Brazilians is so important for promoting equality and justice in education and technology in this country,” Silva says.