November 2, 2024

Programmatic

In a world where nearly everyone is always online, there is no offline.

Facebook Can’t Control Unauthorized Data Sharing

The Cambridge Analytica debacle demonstrated that Facebook has no systematic way of knowing what happens to data once it leaves the platform.

What happened wasn’t a data breach – but that isn’t what matters.

“Partners are bound by agreements that say they’re not supposed to share the data out, but there’s no way to regulate it, and it’s probably happening every day,” one mobile ad exec told AdExchanger.

While Cambridge Analytica is a poster child for this, how many other Cambridge Analyticas are out there? The answer is probably a heck of a lot.

The sharing isn’t necessarily nefarious.

“I can’t tell you how many times I’ve worked with data scientists that are brilliant, way smarter than me, but many haven’t caught up to how ad technology works,” said John Lockmer, director of programmatic and ad ops at DuMont Project, a programmatic consultancy. “There’s a lot of compartmentalization in our industry.”

Before 2014, Facebook’s API let developers collect friend data by default – a practice that has since been discontinued. That’s why UK researcher Aleksandr Kogan was able to gather 50 million Facebook profiles even though his app had only 270,000 users. Kogan eventually sold that data to Cambridge Analytica.
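The arithmetic behind that multiplier is stark. A rough, back-of-the-envelope sketch (the per-user average below is derived from the article’s two figures, not reported directly, and it ignores overlap between friend lists):

```python
# Illustrative arithmetic only: friend-level API access meant each
# consenting app user could expose their entire friend list.
app_users = 270_000               # users who installed Kogan's app (reported)
profiles_collected = 50_000_000   # profiles reportedly gathered

# Implied average reach per installing user (a derived estimate,
# not a figure from the article; real friend lists overlap heavily):
avg_profiles_per_user = profiles_collected / app_users
print(round(avg_profiles_per_user))  # ≈ 185 profiles per app user
```

In other words, under this simplified assumption, every person who opted in effectively handed over data on roughly 185 people who never did.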

Today, apps that want to request detailed user info go through a review process with Facebook in which developers are required to justify what they want to collect and why.

After the lid blew off the Cambridge Analytica story, Facebook hired a forensic auditor in the UK to investigate Cambridge Analytica’s servers in London. The auditor was asked to leave the premises on Monday by Britain’s Information Commissioner’s Office, which is pursuing its own warrant to investigate Cambridge Analytica’s systems.

When AdExchanger asked Facebook whether it plans to audit the other third parties it had previously told to delete data – to make sure they actually did – a company rep pointed to a blog post by Paul Grewal, Facebook’s VP and deputy general counsel. The post said Facebook has a “variety of manual and automated checks” to ensure compliance with its policies, including random audits of existing apps and “regular and proactive monitoring of the fastest-growing apps.”

But one ad executive called it “enforcement theater.” When this person’s company was asked to delete data, the request came orally, rather than in writing, and no one from Facebook requested a look inside the company’s database. The company says it did destroy the data, but there was no follow-up and Facebook never asked for proof.

“We deleted all of it, but there was no audit beyond that,” the exec told AdExchanger. “We could easily have just not deleted it.”

In Kogan’s case, he had permission to collect Facebook data, just not to resell it or share it. But former Cambridge Analytica contractor Christopher Wylie told The Observer that when Facebook’s security protocols were triggered, because Kogan was pulling a large amount of data in a short period of time – millions of profiles over just a few weeks – “apparently Kogan told them it was for academic use so, they were like, ‘Fine.’”

But, as clearly happened with Cambridge Analytica, Facebook data does make its way into the commercial sphere.

Bryant Garvin, director of YouTube, search and display advertising at Purple, has been on the receiving end of shady emails from obscure companies claiming some sort of fancy, proprietary data collection technique.

“It happens every couple of months,” Garvin said. “Someone sends an email from a company I’ve never heard of that purports to have personalized targeting options, and they’re never clear on the science behind it or how they’re getting the data. It’s always a major red flag for me.”

And a CEO of a small agency told AdExchanger that it’s common to get emails from people, sometimes with ties to academia, offering Facebook data or device IDs for sale.

But the more likely issue isn’t a thriving black market for Facebook data fueled by malevolent intent – it’s willful ignorance. A case of “data suppliers promising lots of deep data without being forthcoming about the source, and data buyers determined to not look that closely,” said Beth Morgan, COO at mobile data company Twine.

“The terms of service say that publishers can’t share the data they get through Facebook,” Morgan said. “So, the problem lies in a) ignorance and b) difficulty in auditing/checking. Basically, the data industry operates largely on trust, because it’s relatively hard to track data flows and see where it’s going.”

And this isn’t Facebook’s problem alone. Tracking the provenance of data and where it goes is a major frustration for anyone with proprietary data operating in the digital ecosystem.

“If you integrate with most data vendors, they commingle the data,” said Keith Petri, chief strategy officer at Screen6. “And, especially if you have direct-to-publisher relationships with access to proprietary data, those publishers don’t want their users to be commingled, mixed and profiled by other platforms.”

This post was syndicated from Ad Exchanger.