“Data-Driven Thinking” is written by members of the media community and contains fresh ideas on the digital revolution in media.
Today’s column is written by Ramsey McGrory, chief revenue officer at Mediaocean.
In February 2013, I developed a simple data framework to demystify the data landscape and explain why technology and media companies were building and acquiring companies to bolster their ability to generate, ingest, normalize, transform and activate data at scale.
A lot has and hasn’t changed, so I wanted to revisit the piece. At the time, I identified the “Three V’s of Big Data” – volume, velocity and variety – and they are even more relevant in media today. The volume and variety of data continue to grow significantly, and faster, cheaper processing and cloud-based computing enable companies to do more with that data.
Despite all of these changes, the framework still holds. There are new types of data coming to market, but any new data or company generating data can be placed in this framework.
Here are other big observations about the evolution of the data landscape:
Veracity: The Fourth V
Since 2013, a fourth V has emerged in big data that is incredibly relevant in media: veracity. Many of the acquisitions by major technology companies, such as Oracle, IBM, Salesforce and Google, aim to develop “data truth,” but there is no moat for data. Greg Herbst speaks to it here.
A key aspect of veracity is the “truth of the inference.” The industry often plays fast and loose here. Ted McConnell has spoken about data being neither intrinsically “good” nor “bad,” but rather having qualities.
For example, we infer an attitude when a data set of car site visitors becomes “auto intenders.” In some cases, the veracity of the inference is excellent. In many cases, it’s not, and we need better measurement systems that can determine how behaviors and demographics map to real attitudes, which we can then map to media delivery. Attitudes, rather than demographics and behaviors, drive actions.
Viewability is an important part of veracity, but I dismissed it initially. I heard the comScore call to arms on viewability in 2012 and shrugged, thinking every programmatic ad call got a separate bid and nonviewable inventory was bid-adjusted. I missed the real power of viewability data in ensuring trust between buyers and sellers. Viewability speaks to a broader metadata theme of trust, as well as an underlying theme of data quality and users’ engagement with content delivered against this data.
Beyond viewability, quality of engagement is another broad theme. Ten years ago, DoubleVerify and AdSafe identified when good ads appeared on bad sites. Then, Moat and Integral Ad Science identified when users saw the good ads. Now, White Ops and Amino tell us who is a real user and who is a real seller. We will continue to see innovation in this area.
Location And Cross-Device Scale
With the number of devices, particularly mobile devices, location data has reached massive scale. While there are many applications in advertising, such as local targeting for a Tier 2 auto dealership or digital billboard views, some applications go well beyond advertising. For example, SafeGraph works with universities and health organizations to understand movement data and the spread of infectious diseases.
Also connected to the massive increase of devices, cross-device mapping has exploded. Advertisers and publishers alike need a holistic understanding of their engagement with users, and cross-device mapping provides the necessary (but not sufficient) data. Drawbridge remains a leader in this space for targeting, and MediaWallah is a newer company focused on broader uses and more deterministic approaches.
Connected and addressable television are now at a scale where advanced, data-enabled television targeting is a strategic element of advertising campaigns. Beyond addressable television, agencies have made significant organizational changes to plan and execute integrated television and video initiatives. Connected devices and cross-device mapping are enabling attribution, more holistic planning and better targeting and frequency management.
The Fight For Anonymity
The anonymous/digital data quadrant is under assault by ad and cookie blockers and the growth of mobile, where cookies can’t be used. While device IDs are persistent, there is a growing trend toward people taking control of their anonymity through virtual private networks and Tor. I wrote in April that “consumers will increasingly respond to this perceived attack on their privacy by taking a blanket approach that favors anonymity.”
Consumers’ desire for anonymity may become a demand or even a regulation, as more massive data breaches like Yahoo’s or Equifax’s reveal that hackers have access to millions of consumers’ sensitive information, including Social Security numbers, birthdates and credit card numbers. Data companies, such as Experian, Acxiom, TransUnion and Equifax, may have their entire business models upended as consumers demand greater control of privacy and anonymity.
Collaboration Or Competition
The technology and services companies I referred to as the Z-axis in my original piece – including IBM, Adobe, Salesforce, Oracle, SAP or even Amazon and Google – continue to aggressively and acquisitively execute on their strategies to deliver on infrastructure, data and services. None have moved more aggressively than Oracle, which acquired Datalogix, BlueKai, AddThis, Crosswise and Maxymiser, among others. As Oracle, Salesforce, IBM, Adobe, SAP and others move more forcefully into the media space, the key question is whether they want to partner, coexist or compete with the largest agencies.
The holding company agencies are also focused on building out converged, data-driven solutions. One agency executive stated that agencies’ long-term value lies in managing, interpreting and activating data on behalf of clients. The first iterations of these solutions were trading desks such as WPP’s Xaxis and Publicis’ VivaKi Nerve Center.
Then, these vertical standalone organizations and solutions were horizontally integrated into the operating agencies as capabilities. The upside is that the teams and capabilities are broadly spread across the agencies and more transparent. Hearts & Science, for example, won major accounts on a transparent, data-centric and deeply integrated vision.
The downside is that major brands may view agencies as undifferentiated commodity services, put their media in review with greater frequency and bid them down. This makes it challenging for agencies to invest significantly in convergence, data activation and people, especially in a time of such great change.
The Bottom Line
In a world of first-party, third-party, personal, census, anonymous, panel and pixel data, advertisers and publishers can gain a far deeper understanding of consumers’ awareness and interests while improving the short- and long-term profitability of their brands. Delivering on this vision requires extensive data infrastructure and a deep understanding of advertising, media publishing and ecommerce that many brands and publishers still lack. The advertising, marketing and content technology ecosystems will continue to be funded with massive capital because the opportunities for innovation and disruption are huge.
And across all the ecosystems, the consolidation, standardization, interpretation and activation of data will continue to drive the strategies of most companies. The winners and losers will be decided by how well they can enable massive transformation at materially lower costs.
Advertising and publishing companies that don’t have a clear data strategy will be disrupted by companies that do. If you want to understand who must leverage data more effectively, visit the LUMAscapes – every single one of them.
This post was syndicated from Ad Exchanger.