May 24, 2024

Earlier in May, the U.S. Department of Housing and Urban Development (HUD) announced new Fair Housing Act (FHA) guidance. According to HUD’s Office of Fair Housing and Equal Opportunity (FHEO), the guidance addresses how housing decisions are made when generative artificial intelligence (AI) is involved.

Why It Exists

This guidance could have a crucial impact on the new automated screening (AI screening) tools used by landlords and consumer report providers. In the announcement, HUD explained that its actions responded to President Joe Biden’s Executive Order, “which called on HUD to provide guidance to combat discrimination enabled by automated or algorithmic tools used to make decisions about access to housing and in other real estate-related transactions.”

HUD separated this announcement into two guidance documents addressing landlords’ use of AI-assisted tools. According to the Department, these documents help housing providers and screening companies understand updated best practices, such as how AI interacts with the FHA when used to evaluate applications for housing services. The Fair Housing Act forbids landlords from discriminating against applicants based on protected statuses, including race, color, religion, sex, disability, familial status, and national origin.

AI’s Effect

Traditionally, housing providers made decisions on potential applicants themselves and could consciously guard against biases that might influence the screening process, helping them avoid FHA violations. The rapid rise of AI screening tools, however, has made it possible for landlords to discriminate against applicants unintentionally.

In many cases, AI and algorithms now perform tenant screening, advertise openings, and more, giving them a crucial role in screening decisions. Even when third parties operate these tools, the housing provider remains responsible for the results.

The Guidance’s Effect

HUD’s new guidance emphasizes the need for transparency from housing providers and screening vendors, particularly when AI technology assists in the screening process. The Department explained these points in the announcement: “The Fair Housing Act prohibits intentional housing discrimination and housing practices that have an unjustified discriminatory effect… use of third-party screening companies, including those that use artificial intelligence or other advanced technologies, must comply with the Fair Housing Act, and ensure that all housing applicants are given an equal opportunity to be evaluated on their own merit.”

The FHA guidance also includes “Guiding Principles for Non-Discriminatory Screenings” for screeners. These principles are:

  1. Choose only relevant screening criteria;
  2. Use only accurate records;
  3. Follow the appropriate screening policy;
  4. Be transparent about the screening process;
  5. Permit applicants to challenge any negative information;
  6. Design and test complex models for compliance with the Fair Housing Act (an illustrative testing sketch follows this list).
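
Principle 6 implies routine statistical testing of screening models. One common, illustrative approach (not prescribed by HUD) is to compare approval rates across applicant groups and flag large gaps using the “four-fifths” rule of thumb. The sketch below is a minimal example in Python, assuming a hypothetical log of screening outcomes; real compliance testing requires more rigorous statistical methods and review by legal counsel.

```python
# Illustrative disparate-impact check for a screening model's outcomes.
# Hypothetical data: each record is (applicant_group, approved). The
# "four-fifths" rule of thumb flags any group whose approval rate falls
# below 80% of the highest group's rate -- a common screening heuristic,
# not an official HUD test.
from collections import defaultdict

def approval_rates(outcomes):
    """Compute the approval rate per group from (group, approved) pairs."""
    totals = defaultdict(int)
    approvals = defaultdict(int)
    for group, approved in outcomes:
        totals[group] += 1
        approvals[group] += int(approved)
    return {g: approvals[g] / totals[g] for g in totals}

def four_fifths_flags(rates, threshold=0.8):
    """Return groups whose rate is below `threshold` of the top group's rate."""
    top = max(rates.values())
    return {g: r / top for g, r in rates.items() if r / top < threshold}

# Hypothetical screening log.
outcomes = [
    ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False),
]
rates = approval_rates(outcomes)
print(rates)                     # approx {'group_a': 0.67, 'group_b': 0.33}
print(four_fifths_flags(rates))  # {'group_b': 0.5} -- gap warrants review
```

A check like this is only a starting point for the “design and test” principle: it surfaces outcome gaps that may warrant investigation, but it does not by itself establish or rule out a Fair Housing Act violation.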

This guidance should assist housing providers in ensuring compliance with Fair Housing Act standards.

Disclaimer:
Information provided here is for educational and informational purposes only and does not constitute legal advice. We recommend you contact your own legal counsel with any questions regarding your specific practices and compliance with applicable laws.
