Doug Garnett’s Blog

Don’t Test Whispers

A key to marketing success is a disciplined approach to testing ideas and actions. After all, marketing writing and consulting are filled with tremendously attractive and detailed theories about action “X” causing result “Y”. Yet all of these theories were developed from specific experiences under specific circumstances, so there’s no guarantee that applying them in your world will produce the same result.

So we should test, test, test. And yet…testing experience shows that far more things are tested than are found to conclusively help or hurt. Why? One quite common testing error is to “test whispers” – small changes that simply can’t have a large enough impact to drive measurable change.

I once watched Rubbermaid test whispers in focus groups where a series of five brand-differentiation statements was evaluated. But rather than vary the statements with ideas that were truly significant to consumers, the statements traded off tiny wording changes. (I found it ironically enjoyable to watch the focus group participants quite frankly explain that all the statements said the same thing.)

Focusing on testing important things is even more critical in retail markets, in part because a mature market is extraordinarily noisy. And changes in the retail world are never isolated enough to be cleanly separated from other actions – often actions we might not even be aware of, like promotional choices made by a local store manager.

The statistical variation in normal day-to-day activity is often far larger than the size of change we’re hoping to test. A decade or so ago, we hired a statistician to evaluate results for a tool client who sold through Sears. The changes that retail merchandising efforts produced (e.g., appearing in the Sunday circulars) far outpaced any changes we’d see from advertising spending (a long-term, constant effort). The statistician identified impact from the advertising, but damping out the merchandising noise cost him any valid ability to quantify its size – all he could say was “there was impact.” (It reminded me of my qualitative vs. quantitative chemistry work in school…)

One might think testing becomes far different when we shift to direct response marketing. It doesn’t. Even in DR one has to be careful to test actions that are significant enough to generate a change you can measure with confidence.

So read on in my direct response television article from this month’s Response Magazine, reproduced below.

Testing offers tremendous organizational advantage – especially the ability to respond to political attacks with knowledge gained from testing. So embrace testing. And focus your testing efforts where you can successfully evaluate the results.

Copyright 2013 – Doug Garnett – All Rights Reserved


Don’t Test Whispers
By Doug Garnett, Founder & CEO, Atomic Direct

DRTV campaigns draw tremendous strength and power from the ability to measure phone and web response.

But this strength creates some amazing DRTV mythologies, with some agencies promising laser-like ability to test every creative or offer change.

This is the first of three articles I will write to debunk testing mythologies and search out more powerful testing realities. Let’s start with the mythology that creative changes can be tested with laser-like accuracy.

The Myth

This mythology doesn’t affect the initial campaign test – where we look at big issues like “did the phone ring?” It comes up when we “tweak” campaigns to improve results.

The mythology suggests that, somehow, media efficiency ratios (MERs) are so stable that testing can detect the impact of even the tiniest tweaks in advertising. Average MERs are stable over the long run. But they can be quite fickle in the short run.

The Math

We recently tweaked a long-running infomercial and found it difficult to detect direct impact from the changes. Looking carefully at past and present results, it all became clear: the week-to-week swings in MER were far bigger than the impact we were trying to detect.

In fact, this MER varied so much that it had a standard deviation of 0.25. Given an average MER of 1.80, all we can say with high likelihood is that the next weekly MER will fall between 1.55 and 2.05 – a full 0.50 range, far larger than a hoped-for 10% MER increase (which would move the average by just 0.18).

What this means is that for creative tweaks to be detectable, they have to change the MER by 20% or more. Anything less is just a whisper.
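
For the curious, here’s a minimal sketch of that arithmetic (in Python, purely for illustration), using only the figures quoted above – an average MER of 1.80 and a week-to-week standard deviation of 0.25 – rather than any real campaign data.

    # Illustrative only: the 1.80 average and 0.25 standard deviation are the
    # figures quoted in the text, not a real campaign data set.
    mean_mer = 1.80   # long-run average MER
    sd_mer = 0.25     # week-to-week standard deviation

    # Roughly two-thirds of weekly MERs fall within one standard deviation.
    low, high = mean_mer - sd_mer, mean_mer + sd_mer
    print(f"Typical weekly range: {low:.2f} to {high:.2f}")   # 1.55 to 2.05

    # A hoped-for 10% lift moves the average by only 0.18 -- well inside that swing.
    # A 20% lift (0.36) is at least comparable to it, which is why 20% is the
    # rough floor for anything we can hope to detect.
    print(f"10% lift: {0.10 * mean_mer:.2f}   20% lift: {0.20 * mean_mer:.2f}")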

DRTV’s Challenging Statistics

To understand what’s going on, consider this airing: Food Network, at 5:30 am, on Monday, Feb 18, 2013. That airing is unique – there will never be another with the same audience profile, which is driven by news events, viewer preference shifts, station competition, and more.

Learning from Direct Mail

DRTV is far different from direct mail (where they also “don’t test whispers”). In a way, direct mail teams have it easy. They can test an A/B creative split by sending each creative to 5,000 individuals.

By contrast, in an entire year of DRTV, there are only 52 Food Network airings at 5:30 am on Monday. So can your agency get “laser-like” accuracy from testing creative tweaks with 20-30 airings? No.
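
A quick way to see the gap: the uncertainty in an estimated average shrinks only with the square root of the number of observations, holding the variability of each observation fixed. The sketch below (Python, illustrative only) compares the sample sizes just discussed.

    import math

    # Relative noise of an estimated average scales roughly as 1/sqrt(n), holding
    # per-observation variability fixed. The counts are those discussed above.
    for n in (5000, 52, 25):
        print(f"n = {n:>5}: relative noise ~ 1/sqrt(n) = {1 / math.sqrt(n):.3f}")

    # n = 5000 -> 0.014, n = 52 -> 0.139, n = 25 -> 0.200
    # Two dozen airings carry roughly 14x the relative noise of a 5,000-piece mail
    # cell -- and that's before allowing for how wildly single airings swing.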

Whispers That Strain Credibility

Don’t let anyone tell you, “Your price was $19.99, but the spot would test better at $19.95.” There are reasons $19.95 might be smart, but testing will never prove it.

And take care with anyone who says, “We changed five words and now the show works.” Five words? The only way this isn’t a whisper is if the change was from “you’ll pay $1,000” to “yours free, just for calling.”

Some Suggestions for Testing

To tap into DRTV’s power, you need a clear testing vision. Allow me to suggest a few things:

1. Keep your expectations in line – the only changes you can detect in testing are big ones.
2. Test for longer periods to get better accuracy – perhaps as long as three months (a rough sketch of why appears after this list).
3. Resist the temptation to assume too much from too little. Testing is expensive. But so are mistakes.
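
To see why suggestions 1 and 2 go hand in hand, here is a rough power calculation – again in Python, again leaning on the assumed figures from earlier (weekly MERs averaging 1.80 with a standard deviation of 0.25, weeks treated as independent). It is a sketch under textbook assumptions, not a prescription.

    import math

    def weeks_needed(lift_pct, mean_mer=1.80, sd=0.25, z_alpha=1.96, z_beta=0.84):
        """Weeks of results needed both before and after a change to detect it
        at roughly 95% confidence with 80% power (two-sample comparison of means)."""
        delta = lift_pct * mean_mer
        return math.ceil(2 * (z_alpha + z_beta) ** 2 * sd ** 2 / delta ** 2)

    for lift in (0.05, 0.10, 0.20):
        print(f"{lift:.0%} MER lift: about {weeks_needed(lift)} weeks before and after")

    # 5% lift  -> roughly 120 weeks each side (hopeless)
    # 10% lift -> roughly 31 weeks each side (most of a year, twice)
    # 20% lift -> roughly 8 weeks each side (a practical, if patient, test window)

Under those assumptions, only changes on the order of 20% fit inside a realistic testing window – which is exactly the point of “don’t test whispers.”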

Conclusion

It is ironic that we hear this mythology of test accuracy from an industry that still doesn’t have a solid understanding of the power of market research. But don’t let that scare you off.

DRTV delivers tremendous power for marketers, manufacturers, entrepreneurs, and retailers. Learn to be humble – and succeed. DRTV has the capability to deliver great results to companies that treat it wisely.

