The latest fire at Facebook has been lit by Russia, which allegedly bought ads to destabilize the US, and the fallout might force Facebook to change the way it does business.
“If Facebook really wants to respond to the public outcry, it’s going to have to leave money on the table,” said David Carroll, a professor of media design at The New School and an outspoken critic of Facebook and the part it played during and after the 2016 election.
Criticism over Facebook’s role in distributing politically charged fake news, or “false news” as Facebook likes to call it, has been rampant since Trump’s win in November. Two weeks ago Facebook revealed that an internal investigation had identified $100,000 in ad spend, amounting to around 3,000 ads, coming from fake accounts and pages “likely operated out of Russia.”
False news and fake news are different beasts – the former perpetrated by foreign state actors and the latter either by ideologues or opportunists looking to make a buck off of incendiary clickbait.
But because Facebook has been the vehicle for both, the narratives are converging. And the connective tissue is the question of how much control Facebook has over how its systems are used or misused.
Although it does rely on a sizable team of human moderators to review ads and content, “Facebook or any of the really large marketing platforms don’t have real humans overlooking every single ad buy,” said Mark Jablonowski, CTO and partner at liberal ad tech outfit DSPolitical.
The anti-Semitic targeting debacle is a case in point. A recent exposé in ProPublica revealed a slew of hate-charged ad targeting categories, including “Jew hater” and “how to burn Jews.” After the report, Facebook suspended the ability to use self-reported targeting fields in its ad system. [AdExchanger coverage]
“One of the radical things Facebook has done is to take the interfaces and dashboards that only people in ad tech ops used to look at and make them available to anyone with a credit card,” Carroll said. “And now we’ve seen the effects of putting industrial-strength ad targeting tools into the hands of ordinary people and even foreign state adversaries.”
But that’s not to say Facebook puts out the welcome mat for anything and everything. Facebook’s ad quality team, which is now headed by ad tech vet Rob Leathern, is constantly vetting content in a never-ending game of cat and mouse.
“That’s why you’re not seeing nudity or iPad fill-out-this-poll scams like you used to, and why people under 21 or people in Saudi Arabia don’t see ads for alcohol,” said former Facebook exec and “Chaos Monkeys” author Antonio Garcia Martinez, who led the team that built Facebook’s ad exchange and also helmed the ad quality crew in 2012, right around the time of the second Obama election.
“This content is tagged using machine learning and goes to a special workflow,” Garcia Martinez said. “There’s no reason Facebook couldn’t do this with political content as well.”
Playing (With) Politics
Facebook has long maintained that it’s a platform rather than a publisher. But current events are pushing Facebook to take more responsibility for the news and ad content it distributes, as well as to be more proactive in finding out who’s making money off the content or paying for ads.
“We basically do not regulate them,” said Chris Hoofnagle, a privacy, technology and law professor at the University of California, Berkeley. “Facebook does whatever it wants and then justifies its actions post hoc with paeans to ‘community’ and ‘openness.’”
That could change.
In traditional media, political advertisers must follow stringent rules related to disclosures and not accepting funds, either directly or indirectly, from any foreign national.
When you hear, “I’m candidate X and I approve this message” at the end of a TV spot or radio ad, that’s a legally mandated disclosure required by the Federal Election Campaign Act, which regulates the financing of political campaigns in the United States.
The Federal Election Commission keeps tabs on all traditional political media ad buys at the national, state and local level to make sure no foreign entity is trying to influence a US election. But the commission doesn’t collect data on digital ad buys.
That means that political ads on Facebook, or any platform that enables digital advertising, aren’t required to include disclosures. In other words, political digital advertising is regulated differently from all other political advertising.
But regulators and policymakers are waking up. FEC Vice Chairwoman Caroline Hunter publicly stated that the Russian-sponsored ads on Facebook could trigger an enforcement action, although passing new regulations to address online campaigns is not on the agenda, at least not yet.
In the meantime, Facebook is cooperating with special counsel Robert Mueller’s probe into the Trump campaign’s ties to the Kremlin. After Mueller’s office secured a search warrant, Facebook turned over copies of the ad creatives and details about the targeting criteria it had uncovered.
If Mueller successfully proves collusion and reveals that buying political ads online was part of the alleged conspiracy, Facebook could be found guilty of a federal crime even if no one from Facebook attended any clandestine meetings at Trump Tower.
On Tuesday, Senate Intelligence Committee Chairman Richard Burr (R-NC) said that Facebook will be called to testify before the group soon, although the exact agenda items are TBD.
Only The Tip Of The Iceberg?
But what does a $100,000 ad buy really mean to Facebook? In the grand scheme of its revenue – the platform reported $9.32 billion in revenue for the second quarter in July, $8 billion of which was mobile ad revenue alone – it’s an infinitesimal drop in the bucket.
“It’s just peanuts in the grand scheme of things and there’s no way it could possibly swing an election,” Garcia Martinez said. “But it could always be worse next time: $100,000 today, $100 million tomorrow.”
While that’s still a hypothetical scenario, DSPolitical’s Jablonowski has the uneasy feeling that the Russian ads found through Facebook’s internal investigation are just the beginning.
“Spending $100,000 over a couple of years on a little over 3,000 ad variants – that sounds like a test to me, testing messages to figure out where to go and spend the rest of the money,” he said. “You can get very deep insights on how people react to specific messages on Facebook and what messaging, for example, resonates with voters in America.”
There’s nothing stopping a bad actor from taking the insights gathered from Facebook campaigns and bringing them into the programmatic world to target potentially persuadable voters in particular districts.
“There’s no evidence to support this yet,” Jablonowski said. “But the fact is that programmatic gives little transparency and visibility into ad buys and it would be very difficult to track the spend of foreign actors doing that form of audience targeting.”
What Might Happen Next?
Although the FEC is reluctant to pass new regulations, the issue of whether political ads on Facebook should be regulated (or self-regulated) like they are everywhere else is still very relevant.
It wouldn’t be all that difficult to create consistency between paid political messaging on Facebook and paid political messaging in other channels, said Garcia Martinez.
For one, Facebook would need to know exactly who its customers are and where the money is coming from, something its political ad sales team could accomplish with an additional step in its operational workflow. Including a reference to who paid for an ad within a dropdown UI would take care of disclosure.
With campaigns, as opposed to the ideologically motivated spread of fake news, there’s a money trail, said independent fraud researcher and cybersecurity pundit Augustine Fou.
“And that type of information can be easily subpoenaed if you want to know who paid for a campaign,” Fou said.
But nefarious actors can hide their real identities without much effort, even on a people-based platform like Facebook – and Facebook can only know what it knows. Without more controls in place, even a subpoena can only compel Facebook to turn over, or testify about, what it’s got.
“You may know what credit card paid for it, but you don’t know who is depositing money into the account,” Jablonowski said. “It’s easy for anyone to set up a shell corporation to funnel money through in order to place ad buys, not even talking about how easy it would be for state-sponsored organizations to do it.”
Fake News Travels Fast
But the more potent issue is the battle for hearts and minds, and that’s being fought with fake news, not advertising.
“We only wish advertising was that influential,” said Mike Zaneis, president and CEO of the Trustworthy Accountability Group.
And trying to regulate fake news is another story altogether. Defining what exactly constitutes misinformation is a slippery slope that very quickly becomes a debate over free speech and censorship.
Aside from the obvious and flagrantly baseless hoaxes like the Pizzagate conspiracy theory, the “definition of fake news is so often in the eye of the beholder,” Zaneis said.
“When one person thinks Breitbart is fake news and someone else thinks The Huffington Post is fake news, that makes it extremely difficult to regulate or enforce from a government perspective,” he said. “The reality is that what influences people is their personal sphere of influence, and that comes from the echo chambers people create and the content they share when they’re on the platform.”
This post was syndicated from Ad Exchanger.