Friday, October 4, 2024

NCRC and Fintechs Urge Federal Regulators to Use AI to Detect and Eliminate Lending Discrimination

The National Community Reinvestment Coalition (NCRC) and a group of financial technology firms submitted a joint letter urging regulators to issue clear guidelines to lenders on how new AI fair lending tools can better evaluate disparities in lending. The letter to the Consumer Financial Protection Bureau (CFPB) and Federal Housing Finance Agency (FHFA) - signed by NCRC, Zest AI, Upstart, Stratyfy, and FairPlay - was issued in response to the White House’s Executive Order on AI in October 2023.

Some lenders have not adopted these newer tools for underwriting analysis because they believe they can remain compliant with existing fair lending laws, despite evidence suggesting that older scoring models continue to contribute to systemic discrimination. Newer fair lending tools allow lenders to search for underwriting models that perform as well as older scoring models while also mitigating the risk of discrimination when evaluating low- and moderate-income (LMI) credit applicants.
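To make the idea of a "less discriminatory alternative" search concrete, the sketch below shows, in highly simplified form, what such a comparison might look like: candidate underwriting models are retained only if they perform within a tolerance of the incumbent model, and the one with the least disparity is preferred. The model names, the disparity metric (an adverse impact ratio), and the accuracy tolerance are illustrative assumptions for this sketch, not the actual methods used by any of the letter's signatories.

```python
# Toy sketch of a less discriminatory alternative (LDA) search.
# All names, metrics, and thresholds are illustrative assumptions.
from __future__ import annotations
from dataclasses import dataclass


@dataclass
class Candidate:
    name: str
    accuracy: float                 # predictive performance on a holdout set
    approval_rate_protected: float  # approval rate for the protected group
    approval_rate_reference: float  # approval rate for the reference group

    @property
    def adverse_impact_ratio(self) -> float:
        # Ratio of approval rates; values closer to 1.0 indicate less disparity.
        return self.approval_rate_protected / self.approval_rate_reference


def search_lda(baseline: Candidate, candidates: list[Candidate],
               accuracy_tolerance: float = 0.01) -> Candidate | None:
    """Return the least-disparate candidate that performs within
    `accuracy_tolerance` of the baseline, if one improves on the baseline."""
    viable = [c for c in candidates
              if c.accuracy >= baseline.accuracy - accuracy_tolerance]
    if not viable:
        return None
    best = max(viable, key=lambda c: c.adverse_impact_ratio)
    return best if best.adverse_impact_ratio > baseline.adverse_impact_ratio else None


if __name__ == "__main__":
    baseline = Candidate("legacy_scorecard", 0.82, 0.55, 0.80)
    candidates = [
        Candidate("model_a", 0.815, 0.68, 0.80),
        Candidate("model_b", 0.83, 0.60, 0.80),
    ]
    lda = search_lda(baseline, candidates)
    print(lda.name if lda else "no less discriminatory alternative found")
```

In practice, vendors' LDA searches are far more involved, but the basic trade-off is the one shown here: comparable predictive performance with measurably reduced disparity.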

From the companies’ perspective, these new AI tools can help lenders comply with regulations while expanding credit access to applicants who have traditionally been underserved or deemed too risky by older underwriting models.

The key recommendations of the letter include:

  1. Don’t wait for perfect information to act. AI will continue to evolve rapidly, and regulators should use supervisory highlights to share best practices across the industry.
  2. Provide written guidance on activity that triggers fair lending oversight. The CFPB should provide clearer guidelines on the conditions that would require a lender to conduct a Less Discriminatory Alternative (LDA) search, and on how frequently such searches should be performed.
  3. Clarify that fair lending applies not only to how applicants are treated, but also to how they are selected. Evaluating applicants’ creditworthiness can begin at the earliest stages of the lending process, including during marketing campaign planning, so AI tools that assess applicant risk more comprehensively should be adopted at those stages and favored over older models and tools.
  4. The FHFA should continue to build upon its 2022 AI Advisory Opinions. Those prior advisory opinions offered AI-specific guidance to the government-sponsored enterprises (GSEs) based on select use cases with potential to improve housing finance for consumers.
  5. The CFPB should make fair lending compliance as high a priority as every other part of the lending process. For companies using AI in credit decisioning, the CFPB should make clear that using outdated tools is not sufficient to remain compliant with fair lending laws.
  6. Supervisory examination and training should address routine review of financial institutions’ model testing protocols and results. Fair lending examinations should also include reviews of the models used, their testing protocols, and positive assessment of LDA searches. Data on the efficacy of tools and practices should be shared in a forum with regulators and policymakers.

Read the September 30, 2024 NCRC article.