October 18, 2024
A tenant screening company is facing a lawsuit brought by consumer advocacy organizations in the District of Columbia Superior Court. According to the suit, the company provided inaccurate screening information to landlords, potentially resulting in qualified renters being denied housing.
The screening company has held a contract with the housing authority for the city of Washington, D.C., since 2018. Under that contract, the company provided information on housing voucher recipients, and it also supplied tenant screening services to other landlords nationwide. According to the suit, this practice significantly impacted housing opportunities for people applying for subsidized housing, such as Section 8 voucher holders, and violated the Fair Credit Reporting Act (FCRA).
The plaintiffs in this lawsuit claim that these reports relied on low-quality and potentially inaccurate data for tenant screening.
For example, the plaintiffs alleged that the company confused individuals with similar names and attributed court proceedings involving unrelated people to applicants. The reports also contained information seven years old or older, violating the FCRA’s provisions on reporting outdated information. The plaintiffs further alleged that the vendor knew about these quality issues but continued generating reports without correcting how it acquired the information.
The plaintiffs therefore argued that the defendant failed to implement appropriate manual fact-checking procedures to identify potential mistakes. Specifically, the complaint stated that the defendant “failed to implement standard artificial intelligence (‘AI’) risk management practices to mitigate the known risk of errors and biases in its Service, yet it continued to market its Service and related appeals process as effective means for evaluating rental applicants under FCRA and ‘all other applicable laws and regulations.’”
In essence, the plaintiffs claimed that the defendant used AI to generate these reports, leading to harmful inaccuracies and significant bias. The company also allegedly neglected to manage the well-known potential for AI to produce incorrect information and actively chose not to correct the issue. The plaintiffs stressed that such failures violated the FCRA.
According to the complaint, this lack of quality control measures led to landlords receiving inaccurate reports, including arrest or eviction records that belonged to people other than the housing applicants. Such mix-ups allegedly caused qualified applicants to lose housing opportunities. Even where reports contained accurate data, the plaintiffs raised another issue: the screening company’s scoring algorithms allegedly reflected racial bias.
According to the lawsuit, landlords often depended on the “yes or no” assessment provided in these reports and rarely examined the underlying information before making a decision, meaning inaccurate data could significantly affect an applicant’s eligibility. This lawsuit is one of many filed against tenant screening providers that use AI and other automated systems to generate reports.
Disclaimer:
The information provided here is for educational and informational purposes only and does not constitute legal advice. We recommend you contact your own legal counsel with any questions regarding your specific practices and compliance with applicable laws.