Facebook settles with DOJ over discriminatory housing ads


Facebook owner Meta agreed to overhaul the social network’s targeted advertising system as part of a sweeping settlement with the US Department of Justice after the company was accused of allowing landlords to market their housing ads in a discriminatory manner.

The settlement, which resolves a 2019 lawsuit filed by the Trump administration under the Fair Housing Act, is the second in which the company has agreed to change its ad systems to prevent discrimination. But Tuesday’s agreement goes further than the first, requiring Facebook to overhaul its powerful internal ad targeting tool known as Lookalike Audiences. Government officials said that by allowing advertisers to target housing-related ads based on race, sex, religion or other sensitive characteristics, the product enabled discrimination against people searching for housing.

As part of the settlement, Facebook will build a new automated advertising system that the company says will help deliver housing-related ads to a more equitable mix of the population. Under the agreement, the social media giant must submit that system to a third party for verification. Facebook, which renamed its parent company Meta last year, also agreed to pay a civil penalty of $115,054, the maximum allowed by law.

“This settlement is historic and marks the first time that Meta has agreed to terminate one of its algorithmic targeting tools and change its housing ad delivery algorithms in response to a civil rights lawsuit,” said Assistant Attorney General Kristen Clarke of the Justice Department’s Civil Rights Division.

According to Facebook spokesman Joe Osborne, advertisers will still be able to target their ads to users in particular locations, though not solely by zip code, and will no longer be able to target users based on narrow interest categories.


Roy Austin, Facebook’s vice president of civil rights, said in a statement that the company will use machine learning technology to try to distribute who sees housing-related ads more fairly, regardless of how marketers targeted those ads, by taking into account the age, gender and probable race of users.
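The settlement filings describe this system only at a high level, so the snippet below is a hypothetical sketch of the general idea: measure how an ad’s delivery skews relative to a target demographic mix and reweight serving accordingly. The function name, parameters and the simple reweighting rule are illustrative assumptions, not Meta’s actual method.

```python
# Hypothetical sketch: nudging ad delivery toward a target demographic mix.
# All names and the reweighting rule are illustrative assumptions;
# Meta has not published the actual algorithm.

from collections import Counter

def delivery_weights(impressions, target_share, strength=0.5):
    """Compute per-group serving weights that pull the observed
    demographic mix of ad impressions toward a target distribution.

    impressions: list of group labels for users already shown the ad
    target_share: dict mapping group label -> desired share (sums to 1)
    strength: how aggressively to correct the observed skew (0..1)
    """
    counts = Counter(impressions)
    total = sum(counts.values()) or 1
    weights = {}
    for group, desired in target_share.items():
        observed = counts[group] / total
        # Boost under-served groups, damp over-served ones.
        gap = desired - observed
        weights[group] = max(0.0, 1.0 + strength * gap / max(desired, 1e-9))
    return weights

# Example: delivery so far has skewed 75/25 against a 50/50 target.
shown = ["A"] * 75 + ["B"] * 25
print(delivery_weights(shown, {"A": 0.5, "B": 0.5}))
# The under-served group B gets a weight above 1.0, group A below 1.0.
```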

“Discrimination in housing, employment and credit is an entrenched issue with a long history in the United States, and we are committed to expanding opportunities for marginalized communities in these and other areas,” Austin said in a statement. “This type of work is unprecedented in the advertising industry and represents a significant technological advancement for using machine learning to deliver personalized ads.”

Federal law prohibits housing discrimination based on race, religion, national origin, sex, disability or familial status.

The agreement follows a series of legal complaints against Facebook by the Justice Department, state attorneys general and civil rights groups, arguing that the company’s algorithm-based marketing tools, which give advertisers a unique ability to target ads to thin slices of Facebook’s user base, discriminated against minorities and other vulnerable groups in the areas of housing, credit and employment.

In 2019, Facebook agreed to stop allowing advertisers to use gender, age and zip codes (which often serve as proxies for race) to market housing, loans and job opportunities to its users. That change came after an investigation by the Washington State Attorney General and a ProPublica report that found Facebook allowed advertisers to use its microtargeting tools to hide housing ads from African American users and other minorities. Facebook subsequently said it would no longer allow advertisers to use the “ethnicity” category for housing, loan and job ads.


But since the company agreed to those changes, researchers have found that Facebook’s systems could continue to enable discrimination even when advertisers are barred from checking certain boxes for gender, race or age. In some cases, the software recognizes that people of a certain race or gender frequently click on a certain ad, and then begins to reinforce those biases by serving the ad to “lookalike audiences,” said Peter Romer-Friedman, a principal at the law firm Gupta Wessler PLLC.

The result could be that only men are shown a particular apartment ad, even if the advertiser did not specifically attempt to show the ad only to men, said Romer-Friedman, who has filed multiple civil rights lawsuits against the company, including a 2018 case in which the company agreed to limit ad targeting categories.
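To make the feedback loop Romer-Friedman describes concrete, here is a toy simulation under assumed numbers: if one group clicks an ad even slightly more often and delivery is optimized for clicks, impressions drift overwhelmingly toward that group without any explicit targeting. The click rates and the serving rule are invented for illustration and do not reflect Facebook’s actual delivery system.

```python
# Toy simulation of click-driven bias reinforcement.
# Click rates and the serving rule are illustrative assumptions.

import random

def simulate(rounds=5000, ctr={"men": 0.06, "women": 0.05}):
    shown = {"men": 1, "women": 1}   # impressions per group
    clicks = {"men": 1, "women": 1}  # smoothed click counts
    for _ in range(rounds):
        # Serve to whichever group has the higher observed click rate,
        # with a little exploration so both groups keep being sampled.
        if random.random() < 0.05:
            group = random.choice(["men", "women"])
        else:
            group = max(shown, key=lambda g: clicks[g] / shown[g])
        shown[group] += 1
        if random.random() < ctr[group]:
            clicks[group] += 1
    return shown

random.seed(0)
print(simulate())
# A one-point difference in click rate typically ends with the vast
# majority of impressions going to one group, with no explicit targeting.
```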

Romer-Friedman said the settlement was a “great achievement” as it marked the first time a platform was willing to make major changes to its algorithms in response to a civil rights lawsuit.

For years, Facebook has faced complaints from civil rights activists and people of color who argue that the company’s content enforcement sometimes unfairly removes posts in which people complain about discrimination. In 2020, the company underwent an independent civil rights audit, which found that the company’s policies were a “huge setback” for civil rights.
