November 2, 2024

Programmatic

In a world where nearly everyone is always online, there is no offline.

Facebook Eliminates 5,000 Ad Targeting Options To Pull The Plug On Prejudice


Advertisers that want to exclude people interested in “Passover,” “Native American culture” or “evangelism” from seeing a campaign on Facebook will soon be out of luck.

On Tuesday, Facebook said it’s planning to remove more than 5,000 ad targeting parameters that could be used to discriminate against minority groups.

The targeting options will be unavailable for new campaigns starting Sept. 4. For campaigns already in flight, access will end Oct. 1.

In the coming weeks, advertisers that want to keep buying ads on Facebook will also be required to certify within Ads Manager that they comply with Facebook’s anti-discrimination policy. The certification tool will roll out to US advertisers first, with other countries to follow.

The cynical (or perhaps realist) point of view is that getting rid of these targeting options serves a double purpose for Facebook by keeping its detractors at bay while nudging advertisers toward Custom Audiences.

Facebook has a habit of using the news cycle as air cover to kill multiple birds with one stone.

When Facebook said in March that it would stop allowing third-party data providers to offer their data for targeting directly on its platform, for example, it was ostensibly in direct reaction to the Cambridge Analytica scandal, which had broken a week before. But removing that functionality could be seen as another way to encourage use of Custom Audiences.

But some clients just don’t have the customer data they’d need to feed into Custom Audiences, or they don’t feel comfortable using that data on social platforms, said Jeanne Bright, VP of social activation for North America at Essence. In that case, they either have to rely on Facebook targeting or find other places to advertise.
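
As a rough illustration of what “feeding customer data into Custom Audiences” involves: advertisers typically upload identifiers such as email addresses, which are expected to be normalized and SHA-256 hashed before they leave the advertiser’s systems. The sketch below shows only that hashing step in Python; the sample addresses are made up, and the actual upload call and field names are omitted since the API details aren’t covered here.

```python
import hashlib

def hash_email(email: str) -> str:
    """Normalize an email address (trim whitespace, lowercase) and return
    its SHA-256 hex digest, the general format expected for email
    identifiers in Custom Audience uploads."""
    normalized = email.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# Illustrative only: these addresses are invented for the example.
customers = ["Jane.Doe@example.com ", "sam@example.org"]
hashed = [hash_email(e) for e in customers]
print(hashed)
```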

“In general, Facebook targeting is degraded from where it was six months ago, and that’s causing some clients to think a little harder about their data strategy, the sources of their data, how to use it on social and whether to potentially shift to other channels,” she said.

Facebook doesn’t disclose how many targeting options it offers, but a spokesperson told AdExchanger that removing 5,000 won’t meaningfully affect the overall number. Facebook won’t share a complete list of all the parameters so as not to tip off bad actors on what to avoid.

But if it’s possible to remove more than 5,000 targeting options without making a dent, maybe these parameters weren’t that useful to begin with.

“There’s a ton of targeting available on the platform that just makes no sense,” said Bright, who once had a healthcare client looking to target against an interest in diabetes. “Of the four targeting options that came up, one was ‘diabetes in cats,’ but no regular diabetes in humans. We’re continually surfacing questionable targeting options using Facebook data.”

Most of the parameters on the chopping block fall into the exclusion targeting bucket, which allows advertisers to quarantine audiences of people they want to keep from seeing their ads.

A benign use case for this type of targeting could be a sports team looking to expand its base by excluding existing fans.
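
For a concrete sense of how exclusion targeting is expressed, here is a minimal sketch of a targeting spec for that sports-team case, written as a Python dictionary. The field names loosely mirror the general shape of Facebook’s Marketing API targeting spec (interests, exclusions, geo_locations), but the specific IDs and values here are hypothetical.

```python
import json

# Hypothetical targeting spec: reach people in a city who are interested
# in baseball, while excluding those already flagged as fans of the team.
# Field names follow the general shape of a Marketing API targeting spec;
# the interest IDs and city key below are made up for illustration.
targeting_spec = {
    "geo_locations": {"cities": [{"key": "2418779", "name": "Chicago"}]},
    "interests": [{"id": "6003107902433", "name": "Baseball"}],
    "exclusions": {
        "interests": [{"id": "6003123456789", "name": "Chicago Cubs"}]
    },
}

print(json.dumps(targeting_spec, indent=2))
```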

But the Fair Housing Act, for example, makes it illegal to discriminate in the sale, rental or financing of housing based on religion, race, national origin or gender, which exclusion targeting makes quite easy to do.

Facebook isn’t removing the ability to do multicultural targeting, which is an important part of the data strategy for many clients, Bright said. Although advertisers won’t be able to exclude racial groups from a campaign, for example, they will be able to target groups based on parameters such as affinity or spoken language.

But Facebook has been under a lot of fire for enabling unfortunate targeting capabilities. The Department of Housing and Urban Development filed a complaint against Facebook on Friday for engaging in housing discrimination by allowing landlords and mortgage companies to limit who can see their ads based on factors such as religion, age or sex.

Although Facebook’s targeting purge comes on the heels of the HUD complaint, the company first started limiting exclusion targeting in April. At that point, Facebook had already removed thousands of ad targeting categories, mostly focused on race, ethnicity, sexual orientation and religion.

Facebook has tried to shore up its systems since a 2016 ProPublica report exposed how Facebook’s targeting tools could be used to exclude black, Hispanic and Asian Americans from seeing housing ads.

The following year, ProPublica also called out Facebook for its racist and anti-Semitic targeting categories and for enabling ageist recruitment ads.

This post was syndicated from Ad Exchanger.