Last week, the U.S. Department of Housing and Urban Development charged Facebook with violating civil rights laws that prohibit discriminatory housing ads.
The government's complaint included the by-now-familiar allegation that Facebook's targeting tool allows advertisers to block their ads from reaching people based on their race, religion and gender.
But the complaint also included a more surprising allegation, centered on Facebook's ad-delivery system. Specifically, the government alleged that the ad-delivery system itself discriminates based on characteristics like race and gender -- even when advertisers don't want it to.
“Even if an advertiser tries to target an audience that broadly spans protected class groups, Respondent’s ad delivery system will not show the ad to a diverse audience if the system considers users with particular characteristics most likely to engage with the ad,” HUD says in its complaint.
Now, a new paper -- “Discrimination through optimization: How Facebook’s ad delivery can lead to skewed outcomes” -- by researchers at Northeastern University, the University of Southern California, and the advocacy group Upturn sheds more light on the allegations.
In essence, the researchers say Facebook's ad-delivery system sends ads to particular individuals based on whether the ads will be “relevant” to them. But that determination can turn on stereotypes.
For the report, the researchers ran a variety of ads on Facebook, then determined the likely gender and race of the people who were served each ad. (Facebook doesn't tell advertisers the likely race of users who were served their ads. Instead, the company breaks out results by designated market area, which the researchers used as a proxy for race.)