Following months of heavy scrutiny of its ad targeting practices, Facebook announced today that it is removing more than 5,000 ad targeting options.
In a blog post announcing the changes, Facebook did not say specifically which terms would be removed, but said that it’s “limiting the ability for advertisers to exclude audiences that relate to attributes such as ethnicity or religion.” BuzzFeed News first reported on the announcement.
Facebook’s changes to its ad targeting practices come just a week after the U.S. Department of Housing and Urban Development filed a housing discrimination complaint against Facebook. A quick read through the complaint shows the myriad ways in which landlords could run discriminatory ads on Facebook. Advertisers could essentially exclude users of a certain ethnicity from seeing an ad by choosing to exclude users whom Facebook categorized as being interested in “Latin America” or “Somalia.” They could also exclude users who were interested in “mobility scooters” or “child care” to prevent disabled renters or working parents from seeing their ads.
“When Facebook uses the vast amount of personal data it collects to help advertisers to discriminate, it’s the same as slamming the door in someone’s face,” Anna Maria Farías, HUD’s Assistant Secretary for Fair Housing and Equal Opportunity, said in a statement.
Facebook also announced today that it will require advertisers that run housing, employment, or credit ads to review a post outlining Facebook’s non-discrimination policies, and to affirm that they accept those policies in order to continue running ads. A Facebook spokesperson said that the company has been working on the changes announced today for a while, and that the announcement was not prompted by HUD’s complaint.
In a high-profile October 2016 piece titled “Facebook lets advertisers exclude users by race,” news outlet ProPublica purchased its own ads on Facebook to show how U.S. landlords could use Facebook’s targeting options to discriminate against certain users, in violation of the Fair Housing Act of 1968. At the time, Facebook said that its policies prevented advertisers from using the targeting options for discriminatory practices, but for the most part, it was trusting advertisers to properly utilize the ad targeting options.
A few months after the story, Facebook said that it was developing stronger machine learning tools to more proactively spot discriminatory ads. However, a follow-up investigation from ProPublica revealed that these new tools did not stop dozens of discriminatory ads from being approved.
This isn’t the first time Facebook has made changes to whom advertisers can target — in March, the company stopped letting advertisers target users based on sexual orientation. The move was criticized by some LGBT nonprofits, which said that it made it harder for them to reach their intended audiences.
In today’s announcement, Facebook acknowledged that while the removed ad targeting options have previously been “used in legitimate ways to reach people interested in a certain product or service,” the company said that “we think minimizing the risk of abuse is more important.”
Facebook’s ad targeting practices also received a fresh round of scrutiny following the 2016 U.S. presidential election, when it was revealed that the Russian-linked Internet Research Agency used ads to discourage minorities from voting.