Meta Settles With Justice Department Over Discriminatory Housing Ads
The historic complaint marks the first time federal prosecutors challenged a company over algorithmic bias related to housing discrimination.
Technology behemoth Meta entered into a settlement agreement with the U.S. Department of Justice following allegations of discriminatory housing advertisements on the social media platform formerly known as Facebook.
Federal prosecutors filed a complaint against Meta accusing the company of targeting home advertisements using demographic data protected by the Fair Housing Act in its machine learning algorithms, thereby violating housing law.
The complaint specifically accuses Meta of pairing users with certain housing ads through its “Lookalike Audience” tool, which uses FHA-protected characteristics in its algorithm. These algorithms then determined which users would receive housing advertisements and which were included in an ad’s “Eligible Audience.”
This is the first case filed by the Justice Department alleging algorithmic bias under the FHA.
“As technology rapidly evolves, companies like Meta have a responsibility to ensure their algorithmic tools are not used in a discriminatory manner,” Assistant Attorney General Kristen Clarke of the Justice Department’s Civil Rights Division said in a statement. “This settlement is historic, marking the first time that Meta has agreed to terminate one of its algorithmic targeting tools and modify its delivery algorithms for housing ads in response to a civil rights lawsuit.”
Some of the data allegedly gathered and used in a targeted ad campaign included race, religion, sex, disability status, familial and marital status, and national origin.
Meta’s advertisement delivery system allegedly used these sensitive characteristics to determine which users were, and were not, eligible to see certain ads related to finding a home.
While the Justice Department’s complaint alleged discrimination based on this advertising practice, the agency agreed to let Meta adjust its ad delivery system to include all users in fair housing practices, pursuant to the FHA. Should Meta fail to comply, federal prosecutors will follow through with litigation.
“When a company develops and deploys technology that deprives users of housing opportunities based in whole or in part on protected characteristics, it has violated the Fair Housing Act, just as when companies engage in discriminatory advertising using more traditional advertising methods,” said Damian Williams, U.S. Attorney for the Southern District of New York, where the complaint was filed. “Because of this ground-breaking lawsuit, Meta will—for the first time—change its ad delivery system to address algorithmic discrimination.”
Meta promptly responded to the settlement agreement, saying that the company will introduce a new method designed to ensure ads are fairly distributed.
“We are building into our ads system a method—referred to in the settlement as the ‘variance reduction system’—designed to make sure the audience that ends up seeing a housing ad more closely reflects the eligible targeted audience for that ad,” the company said in a statement.
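Meta has not published the details of the variance reduction system, but the stated goal is measurable: shrink the gap between the demographic makeup of an ad’s eligible audience and the makeup of the users who actually see it. A minimal, hypothetical sketch of that comparison (the group labels, metric, and function names here are illustrative assumptions, not Meta’s actual implementation):

```python
from collections import Counter

def demographic_shares(audience):
    """Fraction of the audience belonging to each demographic group."""
    counts = Counter(audience)
    total = len(audience)
    return {group: n / total for group, n in counts.items()}

def delivery_variance(eligible, delivered):
    """Total absolute gap between the eligible audience's demographic
    makeup and the makeup of the users who actually saw the ad.
    Zero means delivery exactly mirrors the eligible audience."""
    e = demographic_shares(eligible)
    d = demographic_shares(delivered)
    groups = set(e) | set(d)
    return sum(abs(e.get(g, 0.0) - d.get(g, 0.0)) for g in groups)

# Hypothetical example: the eligible audience is split 50/50 across two
# groups, but deliveries skewed 80/20 -- the kind of gap such a system
# would be designed to detect and reduce.
eligible = ["group_a"] * 50 + ["group_b"] * 50
delivered = ["group_a"] * 80 + ["group_b"] * 20
print(delivery_variance(eligible, delivered))  # 0.6
```

A system like the one Meta describes would presumably monitor a gap of this kind during an ad campaign and adjust delivery so the measured variance trends toward zero.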
Meta said that it consulted with advocacy and civil rights organizations to reevaluate its ad delivery systems. This includes retiring its Special Ad Audiences, a tool for advertisers to expand audiences for ad campaigns related to housing opportunities. Advertisers working with Meta are now prohibited from using certain data, such as gender or age.
Targeted advertisements have come under intense scrutiny in relation to social media companies and how their users’ data is collected and analyzed. Some legislators on Capitol Hill have sought to outright prohibit targeted advertising, notably with the Banning Surveillance Advertising Act of 2022.
Representative Anna Eshoo, D-Calif., one of the bill’s sponsors, noted that her bill would prevent the use of personal data, such as that protected by the FHA, in online ad campaigns.
“The ‘surveillance advertising’ business model is premised on the unseemly collection and hoarding of personal data to enable ad targeting,” Eshoo said earlier. “This pernicious practice allows online platforms to chase user engagement at great cost to our society, and it fuels disinformation, discrimination, voter suppression, privacy abuses and so many other harms.”