VOL. 46 | NO. 24 | Friday, June 17, 2022

Facebook and US sign deal to end discriminatory housing ads


NEW YORK (AP) — Facebook will change its algorithms to prevent discriminatory housing advertising, and its parent company will subject itself to court oversight, to settle a lawsuit brought Tuesday by the U.S. Department of Justice.

In a release, U.S. government officials said they had reached an agreement with Meta Platforms Inc., formerly known as Facebook Inc., to settle the lawsuit, which was filed simultaneously in Manhattan federal court.

According to the release, it was the Justice Department's first case challenging algorithmic discrimination under the Fair Housing Act. Facebook will now be subject to Justice Department approval and court oversight for its ad targeting and delivery system.

U.S. Attorney Damian Williams called the lawsuit "groundbreaking." Assistant Attorney General Kristen Clarke called it "historic."

Ashley Settle, a Facebook spokesperson, said in an email that the company was "building a novel machine learning method within our ads system that will change the way housing ads are delivered to people residing in the U.S. across different demographic groups."

She said the company would extend its new method for ads related to employment and credit in the U.S.

"We are excited to pioneer this effort," Settle added in an email.

Williams said Facebook's technology has in the past violated the Fair Housing Act online "just as when companies engage in discriminatory advertising using more traditional advertising methods."

Clarke said "companies like Meta have a responsibility to ensure their algorithmic tools are not used in a discriminatory manner."

Under the terms of the settlement, Facebook must stop using an advertising tool for housing ads, once called "Lookalike Audience," by Dec. 31. The Justice Department said the tool relies on a discriminatory algorithm to find users who "look like" other users on the basis of race, sex and other characteristics protected by the Fair Housing Act.

Facebook also will develop a new system over the next six months to address racial and other disparities caused by its use of personalization algorithms in its delivery system for housing ads, the Justice Department said.
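
The settlement announcement does not spell out how such delivery disparities are to be measured. As a purely illustrative sketch (not Meta's actual method), one simple way to quantify a delivery gap is to compare each demographic group's share of a housing ad's impressions with that group's share of the eligible audience; the group labels and numbers below are hypothetical.

# Illustrative sketch only (not Meta's system). Compares each group's share
# of an ad's impressions to its share of the eligible audience; values near
# zero indicate roughly proportional delivery.
def delivery_disparity(impressions_by_group, eligible_by_group):
    total_impressions = sum(impressions_by_group.values())
    total_eligible = sum(eligible_by_group.values())
    return {
        group: round(
            impressions_by_group.get(group, 0) / total_impressions
            - eligible_by_group[group] / total_eligible,
            4,
        )
        for group in eligible_by_group
    }

# Hypothetical numbers for illustration:
impressions = {"group_a": 7000, "group_b": 3000}
eligible = {"group_a": 5000, "group_b": 5000}
print(delivery_disparity(impressions, eligible))
# {'group_a': 0.2, 'group_b': -0.2}  i.e. group_a over-served by 20 points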

If the new system is inadequate, the settlement agreement can be terminated, the Justice Department said. Per the settlement, Meta also must pay a penalty of just over $115,000.

The announcement comes after Facebook already agreed in March 2019 to overhaul its ad-targeting systems to prevent discrimination in housing, credit and employment ads as part of a legal settlement with a group including the American Civil Liberties Union, the National Fair Housing Alliance and others.

The changes announced then were designed so that advertisers who wanted to run housing, employment or credit ads would no longer be allowed to target people by age, gender or zip code.

The Justice Department said Tuesday that the 2019 settlement reduced the potentially discriminatory targeting options available to advertisers but failed to resolve other problems, including Facebook's discriminatory delivery of housing ads through machine-learning algorithms.
