Showing posts with label National Fair Housing Alliance. Show all posts

Monday, September 23, 2024

Housing Discrimination Complaints in 2023 Continue to Increase Nationally

The national number of fair housing complaints rose to a record high for the third year in a row. There were 34,150 fair housing complaints received in 2023, compared with 33,007 in 2022, according to findings in the National Fair Housing Alliance (NFHA)'s 2024 Fair Housing Trends Report. Harassment complaints also rose sharply, jumping 470.5% for complaints based on color and 114.9% for those based on race.

The data came from 86 NFHA member organizations, the U.S. Department of Housing and Urban Development (HUD)'s 10 regional offices, and 77 state and local government agencies in HUD's Fair Housing Assistance Program (FHAP). Information also was obtained from the U.S. Department of Justice (DOJ).

Most of the millions of housing discrimination incidents each year go unreported because they are difficult to identify or document. Others go unreported because individuals fear retaliation or eviction if they file a complaint. The total number should therefore be considered an undercount.

Private nonprofit fair housing organizations (FHOs) processed 75.5% of complaints, a 5.6% increase from 2022. These FHOs investigate fair housing complaints, collect data, provide fair housing counseling and education to consumers, and help clients file complaints. Fair Housing Assistance Program (FHAP) agencies processed 19.2% of complaints, HUD 5.1%, and the DOJ 0.1%.

As in the previous year, discrimination based on disability accounted for the majority (52.6%) of complaints filed with FHOs, HUD, and FHAP agencies. There were 1,521 complaints of harassment reported, an increase of 66.2%. This is the highest number of harassment complaints reported since NFHA began reporting harassment-specific data in 2006.

Read the July 10, 2024 NFHA article.

Friday, September 29, 2023

Fed Regulator Warns AI & Machine Learning Can Worsen Lending Bias

AI & Automated Decisions Determine Credit Rating, Loan Terms, Hiring, Housing, etc. 

“While these technologies have enormous potential, they also carry risks of violating fair lending laws and perpetuating the very disparities that they have the potential to address,” the Fed’s vice chair of supervision, Michael Barr, said at the recent National Fair Housing Alliance (NFHA) 2023 national conference. While new artificial intelligence tools could cheaply expand credit to more people, machine learning and AI may also amplify bias or inaccuracies inherent in the data used to train the systems, or produce inaccurate predictions.

As concerns grow over increasingly powerful artificial intelligence systems like ChatGPT, the nation’s financial watchdog says it’s working to ensure that companies follow the law when they’re using AI. Automated systems and algorithms help determine credit ratings, loan terms, bank account fees, and other aspects of our financial lives. AI also affects hiring, housing and working conditions.

In the past year, the Consumer Financial Protection Bureau (CFPB) said it has fined banks over mismanaged automated systems that resulted in wrongful home foreclosures, car repossessions, and lost benefit payments after the institutions relied on new technology and faulty algorithms.

One problem is transparency. Under the Fair Credit Reporting Act and the Equal Credit Opportunity Act, for example, financial providers legally must explain any adverse credit decision. Those regulations likewise apply to decisions made about housing and employment. Where AI makes decisions in ways that are too opaque to explain, regulators say the algorithms shouldn’t be used. “I think there was a sense that, ‘Oh, let’s just give it to the robots and there will be no more discrimination,’” CFPB Director Rohit Chopra said. “I think the learning is that that actually isn’t true at all. In some ways the bias is built into the data.”

EEOC Chair Charlotte Burrows said there will be enforcement against AI hiring technology that screens out job applicants with disabilities, for example, as well as so-called “bossware” that illegally surveils workers. The Fed recently announced two policy initiatives to address appraisal discrimination in mortgage transactions. Under the proposed rule, institutions that engage in certain credit decisions would be required to adopt policies, practices, and control systems that guarantee a “high level of confidence” in automated estimates and protect against data manipulation.

In April 2023, four federal agencies, including the Federal Trade Commission and the Department of Justice, announced their commitment to cracking down on automated systems that cause harmful business practices.

Sam Altman, the head of OpenAI, which makes ChatGPT, said government intervention “will be critical to mitigate the risks of increasingly powerful” AI systems, suggesting the formation of a U.S. or global agency to license and regulate the technology. The Electronic Privacy Information Center said the agencies should do more to study and publish information on the relevant AI markets — how the industry is working, who the biggest players are, and how the information collected is being used — as regulators have done with previous new consumer finance products and technologies.

*****

Read the July 19, 2023 The Hill article.

Read the June 15, 2023 Federal Times article.