April 23, 2024

Programmatic


As Antitrust Legislation Is Introduced, How Should We Calculate The Value Of Data?

<p>"Data-Driven Thinking" is written by members of the media community and contains fresh ideas on the digital revolution in media. Today’s column is written by Ka Mo Lau, chief operating officer at Thunder Experience Cloud. Since the 1970s, a decade when the latest technology was the Atari game console and Commodore computer, the standard for<span class="more-link">... <span>Continue reading</span> »</span></p> <p>The post <a rel="nofollow" href="https://adexchanger.com/data-driven-thinking/as-antitrust-legislation-is-introduced-how-should-we-calculate-the-value-of-data/">As Antitrust Legislation Is Introduced, How Should We Calculate The Value Of Data?</a> appeared first on <a rel="nofollow" href="https://adexchanger.com">AdExchanger</a>.</p><img src="http://feeds.feedburner.com/~r/ad-exchange-news/~4/NPPTIk5-7Lk" height="1" width="1" alt="" />

“Data-Driven Thinking” is written by members of the media community and contains fresh ideas on the digital revolution in media.

Today’s column is written by Ka Mo Lau, chief operating officer at Thunder Experience Cloud.

Since the 1970s, a decade when the latest technology was the Atari game console and Commodore computer, the standard for antitrust regulation has been consumer welfare – essentially, the price consumers pay.

Acquisitions or other corporate behavior that increased prices would sound the alarm for regulators. But if these activities lowered prices, they would proceed with the government’s blessing.

This test even superseded other defining features of a monopoly, such as market share – in Western countries, a market share of roughly 25% to 51% is typically treated as the threshold for a monopoly – in determining whether antitrust action should be taken. With growing ire against “Big Tech,” politicians and regulators have begun questioning whether the consumer welfare standard can keep up with modern times.

As advertising industry leaders like Brian O’Kelley have noted, this math test fails with free, ad-supported technology services like the ones Facebook and Google provide. O’Kelley has written, “This is the loophole that allowed Google and Facebook to complete hundreds of acquisitions over the past decade without any significant FTC review. Let’s apply some common sense to the regulatory process just by acknowledging that consumers pay for ad-supported content with their data and their attention.”

Well, the newest proposed US legislation, known as the DASHBOARD Act, could start the process of quantifying just how much consumers are paying brands – not in dollars, but in data. The bill calls for the SEC to develop a method for calculating the value of user data, and for companies with vast user profile databases to then quantify that value – or, put another way, to quantify the price consumers pay for access to these massively popular free services.

Methods for calculation and enforcement could vary widely. The most transparent but most overstated approach would be average revenue per user (ARPU), since it would attribute all of the value to data and none to other factors, such as context and inventory. The least transparent but most understated approach would be an internal calculation of what the tech platform estimates it would pay for the data. The middle ground might be an open market: if the data were sellable, price discovery would establish its value and cost.
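To make the three approaches concrete, here is a minimal sketch of how each might price a single user’s data. Every figure and function name is a hypothetical placeholder for illustration – not actual platform financials and not anything specified in the DASHBOARD Act.

```python
# A minimal sketch of the three valuation methods described above.
# All figures are hypothetical placeholders, not actual platform financials.

def arpu_value(ad_revenue: float, monthly_active_users: int) -> float:
    """Most transparent, most overstated: assigns all ad revenue to data."""
    return ad_revenue / monthly_active_users

def internal_value(platform_estimate_per_user: float) -> float:
    """Least transparent, most understated: whatever the platform says it would pay."""
    return platform_estimate_per_user

def market_value(observed_price_per_user: float) -> float:
    """Middle ground: the price discovered if the data were actually sellable."""
    return observed_price_per_user

if __name__ == "__main__":
    print(f"ARPU method:     ${arpu_value(30e9, 250_000_000):.2f} per user")  # ~$120.00
    print(f"Internal method: ${internal_value(2.00):.2f} per user")
    print(f"Market method:   ${market_value(12.50):.2f} per user")
```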

How a firm values the data vastly affects whether it becomes a target for antitrust enforcement. Under an ARPU standard, the firms with the best monetization would appear to capture most of the market. It looks even worse for them on an average-profit-per-user basis, since many of these firms pay users nothing for their data, while other companies in data-driven advertising may pay a third party for that data or ask users to fill out surveys to collect the same information. Under an internal calculation, large players that lowball the value of their data can keep their estimated market share below the 25% to 51% range and escape scrutiny.
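The effect of lowballing on estimated market share can be shown with simple arithmetic. The sketch below is purely illustrative: the revenue figures, the competitor total and the 25% threshold are hypothetical assumptions, not regulatory definitions.

```python
# Illustrative only: how the choice of valuation method can move a firm's
# estimated "data market share" relative to a 25% threshold. All numbers are invented.

THRESHOLD = 0.25      # lower bound of the 25%-51% range cited above
OTHERS_VALUE = 50e9   # combined data value reported by every other firm (hypothetical)

def market_share(firm_value: float) -> float:
    """The firm's share of total reported data value, including its own figure."""
    return firm_value / (firm_value + OTHERS_VALUE)

scenarios = {
    "ARPU standard (full ad revenue counted)": 30e9,
    "Internal estimate (lowballed)": 5e9,
}

for label, firm_value in scenarios.items():
    share = market_share(firm_value)
    verdict = "exceeds" if share > THRESHOLD else "stays below"
    print(f"{label}: {share:.0%} of the market, {verdict} the {THRESHOLD:.0%} threshold")
```

In this sketch the same firm swings from well above to well below the threshold simply by changing the number it reports, which is the enforcement gap described above.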

One problem with costing out privacy through data’s value is that data may not be zero-sum: one firm having data on a user doesn’t preclude another firm from having data on the same user. Firms can also monetize that data at very different rates depending on many factors. Data’s value is a good starting point, but the approach could be incomplete or more easily challenged.

One solution would be to measure “attention” instead: a person has only a finite amount of time in a day, so market share of attention for digital media may be far easier to measure – and, in fact, Nielsen and Comscore have been doing just that for years. The pitfalls of this approach would be deciding which activities count, such as ecommerce shopping, and the fact that some highly valuable activities, like search advertising, take only seconds.
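As a rough illustration of how an attention-share measure might work, the sketch below divides each platform’s daily minutes of use by total measured time. The platform names and minute counts are invented; real figures would come from measurement panels like those Nielsen and Comscore operate.

```python
# A rough sketch of an attention-share measure: each platform's daily minutes of use
# divided by total measured time. Platform names and minute counts are invented.

daily_minutes = {
    "platform_a": 58,
    "platform_b": 33,
    "platform_c": 21,
    "everything_else": 88,
}

total_minutes = sum(daily_minutes.values())

for name, minutes in sorted(daily_minutes.items(), key=lambda kv: -kv[1]):
    print(f"{name:>15}: {minutes / total_minutes:.0%} of measured digital attention")
```

Measuring time sidesteps the valuation problem, but it inherits the scoping questions noted above: which activities count, and how to weight a seconds-long but highly valuable search.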

Regardless of which approach or combination is taken, there is now a real opportunity to apply a new lens on pricing within an accepted and long-established antitrust framework. Despite calls for new antitrust rules tailored to modern tech companies, regulators can keep a consistent consumer welfare approach to weighing antitrust action while looking at the true costs – even if they aren’t an exchange of money, but rather an exchange of value between consumer and business.

Follow Thunder (@MakeThunder) and AdExchanger (@adexchanger) on Twitter.

This post was syndicated from Ad Exchanger.