
Friday, September 29, 2023

Fed Regulator Warns AI & Machine Learning Can Worsen Lending Bias

AI & Automated Decisions Determine Credit Ratings, Loan Terms, Hiring, Housing, and More

“While these technologies have enormous potential, they also carry risks of violating fair lending laws and perpetuating the very disparities that they have the potential to address,” the Fed’s vice chair for supervision, Michael Barr, said at the recent National Fair Housing Alliance (NFHA) 2023 national conference. While new artificial intelligence tools could expand credit to more people at lower cost, machine learning and AI may also amplify bias or inaccuracies inherent in the data used to train the systems, or make inaccurate predictions.

As concerns grow over increasingly powerful artificial intelligence systems like ChatGPT, the nation’s financial watchdog says it’s working to ensure that companies follow the law when they’re using AI. Automated systems and algorithms help determine credit ratings, loan terms, bank account fees, and other aspects of our financial lives. AI also affects hiring, housing and working conditions.

In the past year, the Consumer Financial Protection Bureau (CFPB) said it has fined banks over mismanaged automated systems that resulted in wrongful home foreclosures, car repossessions, and lost benefit payments after the institutions relied on new technology and faulty algorithms.

One problem is transparency. Under the Fair Credit Reporting Act and the Equal Credit Opportunity Act, for example, financial providers are legally required to explain any adverse credit decision. Those regulations likewise apply to decisions about housing and employment. Where AI makes decisions in ways that are too opaque to explain, regulators say the algorithms shouldn’t be used. “I think there was a sense that, ‘Oh, let’s just give it to the robots and there will be no more discrimination,’” CFPB Director Rohit Chopra said. “I think the learning is that that actually isn’t true at all. In some ways the bias is built into the data.”

EEOC Chair Charlotte Burrows said there will be enforcement against AI hiring technology that screens out job applicants with disabilities, for example, as well as against so-called “bossware” that illegally surveils workers. The Fed recently announced two policy initiatives to address appraisal discrimination in mortgage transactions. Under one proposed rule, institutions that engage in certain credit decisions would be required to adopt policies, practices, and control systems that ensure a “high level of confidence” in automated estimates and protect against data manipulation.

In April 2023, four federal agencies, including the Federal Trade Commission and the Department of Justice, announced their commitment to cracking down on automated systems that result in harmful or discriminatory business practices.

Sam Altman, the head of OpenAI, which makes ChatGPT, said government intervention “will be critical to mitigate the risks of increasingly powerful” AI systems, suggesting the formation of a U.S. or global agency to license and regulate the technology. The Electronic Privacy Information Center said the agencies should do more to study and publish information on the relevant AI markets, how the industry is working, who the biggest players are, and how the information collected is being used, as regulators have done with previous new consumer finance products and technologies.

*****

Read the July 19, 2023 The Hill article.

Read the June 15, 2023 Federal Times article.