Stop us if you’ve heard this one before: the apps you use and the algorithms that power them contain bias. This affects your social media feeds, your search results and even your dating prospects.

Your online meet-cute may feel as spontaneous as an in-person one, but algorithms on Tinder and Hinge are the invisible hand behind the scenes, making matches happen. It’s possible that these apps prioritize profits over partnerships, but what’s certain is that their algorithms are shaped by human bias.

“I think lots of users of color have an idea or a hunch that they’re being siloed or that they are only being shown to certain types of people,” says Apryl Williams. Apryl is a senior Mozilla fellow, a professor at the University of Michigan and the author of the book Not My Type: Automating Sexual Racism in Online Dating. “That hunch is supported by data from a lot of companies that use ethnicity or economic status or education as a data point for pairing. Or at least they claim to. It’s really difficult because we can’t quite see what’s in their code, but we do know what our experience is.”

Apps Designed To Prioritize White Users’ Experience

When Apryl, a Black woman, met her spouse, a white man, on Tinder, he noted that the app insisted on sending him profiles of blonde women. Social apps use techniques like collaborative filtering to guess what a user will like based on what similar users are interested in. In the case of Apryl’s husband, however, the app kept serving him blonde women despite his continually swiping left on them. “Many of these apps assume that the American western ideal type is a blonde-haired, blue-eyed woman,” says Apryl. “So to keep drawing users in, they’re going to make sure that people they believe want that type have access to a steady stream of those users.”
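To make the mechanism concrete, here is a minimal sketch of collaborative filtering, the technique mentioned above. Everything in it is hypothetical — no dating app publishes its code — but it shows the general idea: a user is recommended the profiles that users with similar swipe histories liked. It also shows how majority preferences get amplified: if most similar users swipe right on one type of profile, that type dominates everyone’s recommendations.

```python
# Illustrative sketch only -- the data, names and scoring here are
# invented; real apps' models and inputs are not public.
from math import sqrt

def cosine_similarity(a, b):
    """Cosine similarity between two swipe vectors (1 = right, 0 = left)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(x * x for x in b))
    if norm_a == 0 or norm_b == 0:
        return 0.0
    return dot / (norm_a * norm_b)

def recommend(target, others, ratings):
    """Rank profiles the target hasn't seen by the right-swipes
    of similar users. ratings[user][profile] is 1 (right swipe)
    or 0 (left swipe); absent means not yet shown."""
    profiles = sorted({p for r in ratings.values() for p in r})
    target_vec = [ratings[target].get(p, 0) for p in profiles]
    scores = {}
    for other in others:
        other_vec = [ratings[other].get(p, 0) for p in profiles]
        sim = cosine_similarity(target_vec, other_vec)
        for p, liked in ratings[other].items():
            if p not in ratings[target]:
                # similar users' likes count more toward the score
                scores[p] = scores.get(p, 0.0) + sim * liked
    return sorted(scores, key=scores.get, reverse=True)
```

Note that nothing in this sketch asks the target user what they want: the ranking is driven entirely by what users judged “similar” have done, which is exactly how an app can keep serving profiles a user repeatedly swipes left on.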

There’s a lot we don’t know about what goes on inside our dating apps and how they work. We do have some information though. “Often apps are using what’s called a relevancy score or a relevancy assessment, so they’re looking for relevance between users,” says Apryl. “The specifics about that are hidden, we can’t exactly tell what they’re doing. But we do know that, to an extent, they’re looking for cues and social context that would tell them that these two users would go together. Often these apps are designed for white people by white people and so the things that they are putting into the product like relevancy scores are part of the mass mainstream white western ideal.” These baked-in assumptions, combined with the hidden scores inside dating apps, greatly amplify the effects of bias on the platform.
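The concern in the quote above can be sketched in a few lines. The feature names and weights below are invented purely for illustration — the real inputs are hidden, as Apryl notes — but they show where bias enters: if any weighted feature encodes one group’s ideal of attractiveness, the score will systematically rank matches that fit that ideal above everyone else.

```python
# Hypothetical "relevancy score": a weighted sum of 0-1 features.
# These weights and feature names are made up for illustration.
WEIGHTS = {
    "shared_interests": 0.4,
    "distance_score": 0.2,
    "education_match": 0.2,
    # A hidden feature like this is where bias can creep in:
    # whoever defines "desirability" defines who gets shown.
    "desirability_gap": 0.2,
}

def relevancy(features):
    """Higher score means the pair is surfaced to each other sooner."""
    return sum(WEIGHTS[k] * features[k] for k in WEIGHTS)
```

Because the score is opaque to users, two people can be quietly ranked apart — or together — without ever knowing which features, or whose ideals, drove the number.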

How Does Trust and Safety Relate To Dating Apps?

Finding a partner on a dating app becomes exponentially tougher if users can’t trust in the safety of an app. Unfortunately, many modern matchmaking apps fail at this too. “Companies have started to include race and transphobia in the protections of community guidelines, but there aren’t clear guidelines on how people would report a transphobic incident, or an incident of racial harassment or an incident of racial fetishization,” says Apryl. “By not having a clear path for that, it really discourages users from reporting those claims in moments of anxiety or stress or fear because it isn’t clear that users would feel supported.”

Digital trust and safety in the dating sphere has seen its ups and downs. For example, in February, Tinder announced a new feature targeted at users aged 18-25 to spot misbehavior, specifically with regard to “authenticity, respectfulness, and inclusiveness,” as well as a new feature to improve user verification. Before this, however, Tinder and other Match Group apps had put a pause on user background checks.

Experts like Apryl applaud Match Group’s efforts in sourcing its advisory board, with members that specialize in children’s health and sex trafficking. Still, Apryl worries that the company’s safety efforts aren’t as thorough for users of color and other marginalized groups.

Are Dating Apps Doing Enough To Protect Users Of Color?

Research shows us that people are biased and many are okay with bringing those biases to their dating app experience. Unfortunately, in many cases algorithms worsen things. “The bias we experience online isn’t different from the bias we experience in other areas of life, it’s just amplified by algorithms,” says Apryl. “We see things that make us think twice about why it is [we’re] seeing this person on the app or matching with this person.”

Dating apps, like many social apps, tend to center the white experience — which often leaves users of color in the dust. In the case of dating apps, however, that design choice hasn’t helped anyone. “Trust and safety is designed for and caters to protecting white women,” says Apryl, “and even the best we’ve seen with trust and safety still fails to adequately protect white women. So if it fails to protect the target demographic, I don’t even want to think about how it’s failing the rest of us.”

Are Dating Apps Racist? Here’s What Tinder And Others Can Do To Protect Users Of Color

Written By: Xavier Harding

Edited By: Audrey Hingle, Kevin Zawacki, Lindsay Dearlove

Art By: Shannon Zepeda
