December 2, 2024
A new legal settlement will bar a tenant screening company from using AI-generated scores when assessing applicants who use housing vouchers. This decision follows a U.S. District Judge’s approval of a $2.3 million settlement to resolve allegations of discrimination.
The alleged discrimination disproportionately affected Black applicants, Hispanic applicants, and applicants using housing vouchers. The settlement results from a 2022 class action lawsuit in Massachusetts, in which plaintiffs accused a prominent tenant screening company of violating state law and the Fair Housing Act (FHA) by discriminating against these groups through its scoring system. The company’s algorithm considered factors such as credit history and non-rental debts to produce a proprietary score that landlords used to evaluate potential tenants.
The plaintiffs argued that the opaque system disproportionately assigned low scores to Black and Hispanic applicants. The lawsuit also alleged that applicants using housing vouchers suffered similar discrimination, leading to unfair denials of rental opportunities for many marginalized communities. Furthermore, the plaintiffs claimed the AI scoring system violated the FHA’s anti-discrimination provisions by failing to provide transparency, failing to justify the scores, and relying on factors not directly linked to rental payment, such as credit history.
As part of the settlement, the tenant screening company agreed to stop displaying AI-generated scores for renters using vouchers nationwide, removing those scores from its screening reports for such applicants. The company also cannot recommend whether landlords should accept or deny applicants using housing vouchers. As a result, landlords must evaluate these applicants based on their complete rental history, not the company’s AI-generated scores.
The settlement also created a fund for Massachusetts renters who were denied housing because of the scoring system. Despite the settlement, the company maintained that its practices complied with the law; it settled to avoid the high costs and distraction of continued litigation.
This case highlights broader concerns about using AI tools in tenant screening. Landlords who screen applicants should ensure they comply with state law and the FHA. To that end, they should review and adjust their screening tools so that AI-driven screenings do not produce discriminatory denials of housing. Denying applicants based on race or other protected characteristics could violate housing laws and lead to lawsuits.
Though AI tools promise faster and more accurate screenings, discrimination can still occur. The law guarantees renters a fair screening process free of bias based on race or voucher use, so maintaining an equitable and legal tenant screening process must take priority.