Do Digital Ads Work? Who Knows.

With all the data available and all the money being spent on digital ads, it should be easy to decide how effective they are. Only, it isn't.
It's time to rethink how we measure the effectiveness of digital ads.

Digital ad spending continues (unsurprisingly) to grow faster than the rest of the ad market, claiming an ever-larger share of the total. Research firm eMarketer says budgets for digital ads will hit $42.5 billion in 2013 and jump to $60.4 billion in 2017.

You would think, with all the data available and all that money being shelled out, it would be easy to see whether digital advertising campaigns work. After all, you have click-through rates, page traffic and online sales: a whole matrix of numbers for deciding whether that display ad, or something of that ilk, was worth buying.

But, actually, we might be looking at the wrong numbers, or, at the very least, interpreting the right numbers the wrong way.

A working paper from three researchers who once worked together crunching numbers at Yahoo suggests we need to reset how we view the traditional metrics used to judge effectiveness.

Here are the three problems with our current approach, according to the authors, Randall Lewis and David Reiley, now at Google, and Justin Rao, now at Microsoft:

1. Just because people don't click doesn't mean they don't buy.
If you are looking at click-through rates of ads, and matching that to sales, you are missing a big audience: The customers who see your ad, think about it, and then buy later, often in a physical retail location. Leaving out these customers means you often underestimate the effectiveness of your ad campaign. Trouble is, these buyers are a bit tougher to capture, but the authors note that, with the incorporation of some third-party measurements, it is getting easier to figure out how many of them are out there.

2. Just because they click doesn't mean they bought something because of the ad.
When you look at click-through rates, you generally shout “Huzzah” when you can tie a click on the ad directly to a purchase. The ad agency pops a cork, saying it was clearly their creative that did the trick. The marketer assumes it is the quality of the product. The host site pats itself on the back for having such wonderful associated content, it just makes readers want to spend. All may be fooling themselves. Sometimes people just want to buy stuff. So they search for stuff. And then they happen upon your ad and see an easy way to do it. They were buying anyway. You got lucky.

3. Just because you have data doesn't mean they don't suck.
Well, the authors didn't quite put it that way. Instead, they said, “(M)ore sophisticated models that do compare exposed to unexposed users to establish a baseline purchase rate typically rely on natural, endogenous advertising exposure and can easily generate biased estimates due to unobserved heterogeneity.” While at Yahoo, the authors found that studies of buyers vs. non-buyers were put together wrong, with a lot of bad data and noise.
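To see why "natural, endogenous advertising exposure" produces biased estimates, consider a toy simulation (my own illustration, not the authors' model): people who already intend to buy also browse more, so they see more ads. Comparing exposed to unexposed users then shows a healthy "lift" even when the ad has zero true effect, while a randomized comparison correctly shows none.

```python
import random

random.seed(0)

def estimated_lift(randomized):
    """Difference in purchase rate, exposed minus unexposed."""
    exp_buys = exp_n = unexp_buys = unexp_n = 0
    for _ in range(100_000):
        intent = random.random()  # latent purchase intent, unobserved
        if randomized:
            exposed = random.random() < 0.5  # coin-flip exposure
        else:
            # Endogenous exposure: high-intent users browse more,
            # so they are more likely to see the ad.
            exposed = random.random() < intent
        # In this toy world the ad itself has ZERO effect on buying.
        buys = random.random() < intent * 0.1
        if exposed:
            exp_n += 1
            exp_buys += buys
        else:
            unexp_n += 1
            unexp_buys += buys
    return exp_buys / exp_n - unexp_buys / unexp_n

print(f"naive lift, endogenous exposure: {estimated_lift(False):+.4f}")
print(f"lift under randomization:        {estimated_lift(True):+.4f}")
```

The naive comparison credits the ad with roughly a three-percentage-point lift that is entirely selection effect; randomizing who sees the ad makes it vanish. That is the "unobserved heterogeneity" in the quote: purchase intent drives both exposure and buying, and the model never sees it.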

So, what is a marketer to do? Well, click-through rates are already being discounted in many ad buys. Last year, researchers from Hewlett-Packard, using their own digital display ads for printers, found that click-through rates were too random to be used as a metric for effectiveness. Instead, they only used them to compare whether one set of creative did better than another – and even then they admitted they didn't entirely trust the numbers.

That leaves other approaches, like gauging reach, engagement, brand awareness, traffic to your site, and stuff like that. These are often more difficult, and more expensive, to monitor than the cheap-and-easy click-through, but they have an advantage: The data actually may be useful.