April 26, 2024


Data For Data’s Sake


“Brand Aware” explores the data-driven digital ad ecosystem from the marketer’s point of view.

Today’s column is written by Belinda J. Smith, global director of media activation at Electronic Arts. Belinda will present “EA’s Programmatic Arts” at AdExchanger’s upcoming PROGRAMMATIC I/O New York conference on October 25-26.   

Having been in the biddable/programmatic space for more than 10 years now, I have spent a lot of my time focusing on that infamous “o” word: optimization.

I have had more than one job where my entire purpose was to test every imaginable variable and combination to find the most efficient way to get ads in front of an audience and maximize ROI. The point was to collect as much data as possible about every single thing under the sun to help marketers succeed.

When I think about this approach it reminds me of the “Rime of the Ancient Mariner”: “Water, water, everywhere / Nor any drop to drink.” That is to say, as an industry we are so focused on optimizing, testing and getting more and more data that we are now drowning in data without a sense of what we’re really learning from it.

Programmatic was built on the promise of being one of the few ways to help marketers truly understand “which 50%” of their media is waste. And while it has enabled us to test and learn in a way we never thought possible, we’re not devoting nearly enough time or energy to thinking about what to do with all of this new data to improve our business outcomes.

With the emergence and growing accessibility of, and interest in, machine learning, augmented or artificial intelligence and the ubiquitous “algorithm,” marketers have become absolutely obsessed with optimizing, testing and reaching unimaginable efficiencies with ever more sophisticated and nuanced media-buying strategies.

Consider that there are multimillion-dollar firms that do nothing but show ads to people for products they’ve already looked at on a website. They are able to profitably host yacht parties at Cannes and rent halls at Dmexco by doing nothing but testing the right time and way to remind consumers that they’ve already looked at something!

Even more fascinating for me to watch has been the growing popularity of the nebulous concept of incrementality. Incrementality involves paying for your target audience not to see your ads so that you can (hopefully) determine whether you’re showing ads to people who were going to convert anyway or whether showing your ad was a factor in their decision to convert.

And while this approach is sophisticated, complicated, costly and intensive, the data you get back is also perplexing. Someone recently told me they ran an “incrementality” test (which was actually the definition of an A/B holdout test, but that’s a topic for another day) that showed 20% lift. They were thrilled because the other tests they had run had shown only a 3% to 8% lift. And as a marketer, my first thought was, “Did you just tell me you’re excited about a test which suggests 80% of your budget is waste?”
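
To make the arithmetic behind that reaction concrete, here is a minimal sketch of the lift math from a holdout test of that shape. The group sizes and conversion counts are hypothetical, chosen only so the reported lift comes out to 20%; nothing here reflects the actual test I was told about.

```python
# Hypothetical holdout test: the control group is withheld from ads,
# the treatment group is exposed. All numbers are illustrative only.

control_users = 100_000
treatment_users = 100_000

control_conversions = 5_000      # converted without ever seeing the ads
treatment_conversions = 6_000    # converted after being eligible to see the ads

control_rate = control_conversions / control_users        # 5.0%
treatment_rate = treatment_conversions / treatment_users  # 6.0%

# Lift relative to the control group: (6.0% - 5.0%) / 5.0% = 20%
lift = (treatment_rate - control_rate) / control_rate

# Share of the treatment group's conversions that were actually incremental:
# (6.0% - 5.0%) / 6.0% ≈ 16.7%, i.e. roughly 83% would have converted anyway.
incremental_share = (treatment_rate - control_rate) / treatment_rate

print(f"Lift vs. control: {lift:.0%}")                                # 20%
print(f"Incremental share of conversions: {incremental_share:.1%}")   # 16.7%
```

Under those hypothetical numbers, a 20% lift over the control means only about one in six of the conversions being credited to the campaign was actually incremental; the rest would have happened anyway, which is the gap behind the question above.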

When I press people on what they’ve changed as a result of this type of testing, the most common answer I get is that the data goes into a black-box attribution model so that they can further optimize their spend. What?!

Now I can already predict some comment along the lines of, “My company tested DCO/retargeting/incrementality/etc. and saw a 30% lift in effectiveness.” And to that I say, that’s great! So, does that mean you were able to sell 30% more of your goods or services with the same budget by applying those learnings? The answer I usually get back to that question is “Well, no.” Then what does the 30% increase in “effectiveness” mean? What action are you supposed to take with that information? Isn’t the point of understanding waste to be able to stop wasting?

I’m not saying we shouldn’t test or optimize or gather as much data as possible. What I’m saying is this: We need to get our houses in order. We should not be relying on tests we don’t fully understand to give us directional data we don’t feel confident in as proof we are doing a good job.

If you are running a test and would not significantly alter your marketing strategy or budget based on the outcome, why are you wasting your money on that test? If you are running a test that you can’t explain to a colleague or letting a proprietary algorithm blindly optimize your campaigns to be more efficient without understanding how that’s being done, how will you interpret the data and results and know it’s being done correctly? If you ran a test that told you your marketing is effective but your business did not see a positive increase in sales, was that data really helpful to you?

For something to be valuable it should be understandable, instructional and scalable. Similar to managing product development cycles, I have found it incredibly helpful to spend time creating a testing road map for media. That plan includes what I am seeking to learn, how I propose to test my hypotheses and what implications the resulting data will have for my current operations and the business overall.

Mapping this out ahead of time helps to better assign time and resources to prioritize the big things I want to learn. It is also a useful tool to allow others outside of my discipline to offer feedback on the objectives, priority and approach of such initiatives while tangibly demonstrating the overall value of the media program and the data coming back from it.
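
As a purely illustrative sketch of what one entry in such a road map might capture (the fields and example values below are hypothetical, not anything EA actually uses), the structure can be as simple as this:

```python
from dataclasses import dataclass

@dataclass
class RoadmapEntry:
    """One hypothetical line item in a media testing road map."""
    learning_goal: str       # what we are seeking to learn
    hypothesis: str          # the claim the test is meant to confirm or reject
    test_design: str         # how we propose to test it
    decision_if_true: str    # what changes in operations or budget if the result holds
    decision_if_false: str   # what changes if it does not
    priority: int = 3        # 1 = highest; used when assigning time and resources

example = RoadmapEntry(
    learning_goal="Does retargeting drive incremental sales, not just clicks?",
    hypothesis="Retargeting adds a meaningful share of incremental conversions",
    test_design="Holdout test with a withheld control group over one quarter",
    decision_if_true="Keep the retargeting budget and re-test annually",
    decision_if_false="Reallocate retargeting spend to prospecting",
    priority=1,
)
```

The tooling matters far less than forcing the “what changes if this is true or false” fields to be filled in before any budget is committed, which is exactly the question posed earlier: would you significantly alter your strategy or spend based on the outcome?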

At this year’s ANA Media Conference in Orlando, ANA CEO Bob Liodice reminded us that no matter how much growth eMarketer is showing in digital media, and no matter how much more effective Nielsen and comScore are claiming we are as an industry, total US sales declined 7.3% to $14.5 trillion in 2016. This was the second straight year of sales declines, even as we talk about how much better we’re getting at digital. That’s the data that I’m most interested in.

Follow Belinda J. Smith (@BJStech), Electronic Arts (@EA) and AdExchanger (@adexchanger) on Twitter.

This post was syndicated from Ad Exchanger.