This is a profile of Countering Tenant Screening, a Mozilla Technology Fund awardee.

As any renter knows, finding an apartment can be an odyssey of fierce competition and high rents. And in many cases, renters also face discrimination based on race, gender, and other demographic traits.

Now this fraught landscape is encountering a new trend: automated tenant checks. These AI systems collect data from across the web and use it to generate scores rating which renters are “trustworthy” and which are not.

More and more landlords are using this technology to accept or reject tenants. But what may at first appear to be a tool for efficient property management is frequently unfair and wrong.

“There is a long history of housing discrimination and segregation in the U.S.,” explains Wonyoung So, an urban technologist and Ph.D. candidate in Urban Science at MIT. “With that perspective, it’s really crucial to understand the technology being used in the housing industry.”

The findings so far have been bleak: “Tenant screenings are prime examples of technology using seemingly neutral data to justify discrimination against different races or genders,” So says. Indeed, So notes that Black women with children are disproportionately affected by these screening services.

This is a trend that’s become common in the world of AI: Purportedly impartial technology that actually amplifies centuries-old biases. Or put simply, “automating inequality,” So says.

For example, tenant screening algorithms rely heavily on past eviction reports from third-party data brokers. “It’s alarming, because sometimes eviction reports are filed even if a landlord has no intention of evicting tenants,” So says; the filing might just be a tactic to collect rent. But tenant screening reports often omit such context, and landlords tend to deny tenants who have any kind of eviction record, even if the tenant won the case.

“These systems are basically blacklisting tenants,” So adds.

Meanwhile, these technologies face little to no regulation. There are no laws requiring companies to disclose their screening algorithms. And the algorithms’ proprietary nature means oversight and accountability are nearly impossible.

For this reason, So isn’t just studying the phenomenon; he’s actively trying to fix it. He is the creator of Countering Tenant Screening, an initiative that scrutinizes how these algorithms work, and he’s developing tools that allow tenants to request information about the algorithms and their decisions. Countering Tenant Screening is also a 2023 Mozilla Technology Fund grantee, part of a cohort of eight projects auditing AI systems. The fund supports open-source technologists solving pressing internet health issues, like algorithmic bias and AI opacity.

So’s project is in the public interest, but it also requires the public’s participation: Countering Tenant Screening relies on crowdsourced data to succeed. So is collecting the experiences and screening reports of tenants who encounter these algorithms. The goal: to gain a window into this opaque space.

So is optimistic that there is a critical mass of people who can fuel this work. He cites community groups like tenant organizations, as well as legal aid organizations fighting evictions.

Wonyoung So was drawn to this space through his interest in urban planning, the focus of his doctoral work at MIT. In addition to studying tenant screening apps, he has also investigated other “proptech,” like eviction apps and smart home systems.

“A lot of these technologies are using data collection to perpetuate housing inequality and undermine housing affordability at large,” he says. Countering Tenant Screening is a first step toward fixing that.