AI can be hard to wrap one’s head around. Software we can’t see and don’t have a say in makes decisions for us about our lives — and sometimes we don’t even know it. What decisions are being made for me, and who’s keeping it all in check? That’s what we hope to explain. This is Breaking Bias.
Picture this: you’re an artist who’s decided to take on a project that involves pictures of hands. (“Hands?” you say. Yes, stick with us.) You hop on Google and run a search for the term “hand.” Nothing complex or convoluted, just “hand.” The results? All white hands, with barely a brown, beige or Black palm in sight. How does something like this happen? Is this racist? Is it on purpose? Who’s at fault?
This scenario isn’t made up. Back in 2015, graphic designer Johanna Burai ran into this exact problem. Her Google image search led to a furrowed brow and the birth of a new project: World White Web. Her site points out how many of Google’s search results, like many sites on the web, presume white internet users to be the default. Because of inadvertent bias, a search for the term “hand” was really a search for the term “white hand.”
Burai never intended to create World White Web, but she felt the problem needed to be called out. “This all started with a project meant to map the human senses,” says Burai. “I wanted there to be proper representation in this project but as I scrolled through the image results for ‘hand’ I was a bit shocked — I didn’t find a single Black hand.” Burai says it wasn’t until she specifically typed in the term “Black hand” that she saw anything other than hands belonging to white people. “Even then most of what popped up were vector images and pictograms of Black hand illustrations.” So not pictures of human hands, basically.
Traditionally, those of us not employed by Google know little about how it teaches its software to see race. Lately the company’s added a bit more color around the issue. In June, reports showed Google planning to move away from the Fitzpatrick skin type (FST) scale that many tech companies use and toward a more inclusive alternative expected to recognize a wider variety of skin tones. But Google’s problems with race aren’t limited to images.
Researchers have found other ways Google can be racially biased. Most famously, Dr. Safiya Noble, in her book Algorithms of Oppression, shows how searches for terms related to Black girls mostly led to porn. There’s also the time Google Photos sorted pictures of Black people into an album labeled “gorillas,” or when a search for “three Black teenagers” yielded very different results compared to a search for “three white teenagers.” And then there was the criminal ad issue, where searching a person’s name could prompt Google’s ad partners to suggest a criminal record search for that name. This used to happen a lot with names like Darnell and Jermaine, and less with names like Geoffrey and Dustin.
Not a good look, Google.
Google isn’t alone, though. Algorithms created by other companies have exhibited unintentional bias as well. On TikTok, visit a user’s page and the suggestions of whom to follow next weirdly surface more people of the same ethnicity and hair color. On Twitter, the service’s image cropping algorithm would unfairly prioritize white people over Black people when using AI to crop a photo. AI-driven racial bias exists offline too. Take, for example, how health care algorithms can be less likely to offer Black patients the care they need, or how online mortgage lenders often provide loans to Black and Latino borrowers at higher interest rates.
It’s no surprise that many of the biases found within AI reflect the biases held by larger society. These biases persist, in part, due to how we train artificial intelligence. If the information we use to teach software is biased, it could lead to the software producing biased results. That’s how we get to a point where Black girl searches lead to porn or searches for Black names surface criminal record ads.
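The dynamic described above, skewed training data producing skewed outputs, can be sketched with a toy example. Everything here is hypothetical: the made-up 90/10 "training set" and the simple tag-counting ranker are stand-ins for the far more complex systems real search engines use, but the mechanism is the same. The skew lives in the data, not in any explicit rule about race.

```python
from collections import Counter

# Hypothetical "training data": image tags scraped from the web.
# Most images tagged "hand" also carry tags associated with lighter
# skin, mirroring who is over-represented in the source material.
training_images = (
    [{"tags": ["hand", "light-skin"]}] * 90
    + [{"tags": ["hand", "dark-skin"]}] * 10
)

def top_results(query, images, k=5):
    """Rank tags that co-occur with the query, a crude stand-in
    for a relevance model that has learned from the data above."""
    counts = Counter()
    for img in images:
        if query in img["tags"]:
            for tag in img["tags"]:
                if tag != query:
                    counts[tag] += 1
    return counts.most_common(k)

# The ranker never sees the word "race", yet its output simply
# reproduces the 90/10 skew baked into its training set.
print(top_results("hand", training_images))
# → [('light-skin', 90), ('dark-skin', 10)]
```

Fixing results like these means fixing the data (or compensating for it), which is why audits of what goes into these systems matter as much as audits of what comes out.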
The ads issue goes even deeper. In April, The Markup showed us how Google allowed YouTube creators to earn ad money on videos related to keywords like “white power” and “white lives matter,” yet prevented them from earning ad dollars from keywords like “Black power” or “Black Lives Matter.”
The report led to action. After the story broke, racial-justice organization Color Of Change published a petition demanding that Google open up about its company’s relationship to race, from the ad keywords it monetizes (and doesn’t monetize) to the company’s hiring practices.
“We’re calling for Google to do a comprehensive evaluation of their products, practices, policies and personnel decisions to look for bias,” says Johnny Mathias, deputy senior campaign director at Color Of Change. The Markup’s reporting uncovered how Google’s ads API blocked terms like “Black lives matter” but allowed terms like “all lives matter” — meaning content creators associated with the former are unable to make money on their content while creators associated with the latter can.
Google responded to Color Of Change’s audit request, saying the company “welcome[s] feedback from Color Of Change.” The nonprofit has also requested audits from Facebook and Airbnb in the past; both responded with commitments to measure and address harm on their platforms. Google has yet to agree to Color Of Change’s request.
Google didn’t create the biases being reinforced in its — and other companies’ — apps. But when it comes to its own software it puts out into the world, Google does have a responsibility to counter bias where it can. “Just because the harm is a result of 1’s and 0’s doesn’t mean it’s not real,” says Mathias. For companies like Google, its global reach makes it even more crucial that unintentional bias is proactively addressed. “It can’t be the job of academics and civil rights groups to ensure Google isn’t creating harm,” says Mathias. “You’d never allow a lawnmower company to say, ‘it’s the job of researchers to make sure our lawnmower doesn’t cut off people’s hands.’”
Mathias reminds us that with all that we know, there may be more to the picture. “We’re only aware of the surface level problems those of us on the outside of Google can see,” says Mathias. “We simply don’t know what we don’t know, which is why we need Google to conduct a racial equity audit.” Now that’s something the entire world wide web could appreciate.
Written By Xavier Harding
Edited By Anil Kanji, Anna Jay, Ashley Boyd, Carys Afoko
Art By Sabrina Ng, Nancy Tran
(Racial identifiers in this post are styled in accordance with AP Style)