Pre-testing ads doesn’t kill creativity
In this guest post, Millward Brown’s Daren Poole champions the pre-testing of ads
It’s a truth universally acknowledged that a man in possession of research must be a tosser – or so 42 Below vodka founder, Geoff Ross, would have us believe.
Speaking at the Battle of Big Thinking event last month, Geoff is reported as saying: “In my view, advertising is completely in a quagmire, multiple layers, pre-testing, post-testing. In the end nothing good is going to survive.”
I believe he is wrong.
Great ads shine in pre and post-testing, just as they hit the spot with consumers, make clients happy, and keep them coming back to agencies for future campaigns.
Of course, if the definition of a “good” ad is one that wins you the adulation of the advertising industry without regard for the target audience, then I can see why research is not popular.
In my experience, there are many clients who are willing to put their faith in an agency and go out on a limb with creative. However, before they commit to a multi-million dollar media roster and put the new creative out there for all to see, they test the ad with the target audience to make sure the emperor really is wearing new clothes.
We absolutely value the creativity that has gone into an ad, and given Millward Brown’s heritage of providing research-based advice to help clients build brands and achieve cut-through, the last thing we want to do is create a mediocre, homogeneous ad. Clients wouldn’t continue to use our services if we did.
Despite widespread fears that pre-testing involves a Charlie and the Chocolate Factory-style sorting room with a garbage chute for “bad nuts”, the reality is that research rarely leads to an ad being canned. More often than not, pre-testing (certainly our brand of copy testing, anyway) leads to optimisation of creative – weak ads can become good, and good ads can become great.
While none of us like the idea of someone evaluating our work (hands up who loves performance reviews?) the reality is that advertising is a most public profession and the target audience is the ultimate judge of creativity.
Pre-testing means a brand can gain insights into how an ad is going to be received by that audience.
The best ads engage, deliver the intended message, and generate some form of response. If there’s an engagement gap, off-strategy communication, or a response other than the one intended, there may be cause for concern. If the response is one of disappointment or thinking the ad is boring or irrelevant, there’s a problem with the idea or the execution.
Our job is to identify these problems and recommend ways to solve them. This is where our experience of testing well over 1,400 ads in Australia and 65,000 ads worldwide comes into play.
I can’t give away too much detail, but some of the work we have done with CPG brands in the past 12 months has seen a soundtrack transform from something viewed as a little strange to become quirky and impactful. Another great example is where we helped characters evolve from being creepy and a bit disturbing to wonderfully eccentric. We also identified the route to help a brand develop one of the most popular ads of the summer, and we’ve helped make a brand’s consumer benefits more evident in more than one case.
Of course, pre-testing is not 100 per cent foolproof. I remember one case, early in my career, where we were wide of the mark because we were evaluating the wrong strategy among the wrong sample. This taught me that understanding the intentions of client and ad agency as early as possible in the creative development process is critical.
I’m the first to admit the research industry could do a better job of debunking the myths about advertising testing and I’m hoping this piece gets the conversation started. In my experience, the most common myth is that pre-testing assumes that all advertising works in the same way or is intended to have the same outcomes. Other chestnuts – to name a few – are that it’s a pass or fail test, that it can’t do emotion, and that it favours ads that are full of packshots.
To close, I’m going to quote another Battle of Big Thinking speaker, Richard Sauerman, as reported in the Sydney Morning Herald: “Consumers won’t remember what you tell them, but they’ll never forget how you make them feel.”
I couldn’t agree more.
- Daren Poole is chief client officer at Millward Brown
Questions:
How much is any target audience genuinely represented by a sample of people who have the time and inclination to spend their time getting involved in focus groups in the first place?
How impactful can a cheap mock-up of an ad ever really be?
Assuming the above two issues can be overcome, how do you set about gleaning meaningful responses from people when they’re sitting in an alien room with a bunch of strangers and a mirrored wall staring at them?
And why isn’t involving research early on in the process – to inform the brief – enough comfort and direction to avert pre-testing of the resulting creative work?
User ID not verified.
If it worked no ad would fail.
Clients continue to use your service so they can cover their ass upstream.
Many of the world’s most successful advertisers and marketers don’t use your kind of testing.
Many of the world’s most successful ads (in terms of sales) wouldn’t pass your so-called test.
We know you have a business to flog. It’s just a crappy, miserable one.
User ID not verified.
Andrew, you have a valid point.
A lot of clients may use the test as part of a KPI, or a benchmark for their media agencies.
To Darren’s point – he hasn’t really mentioned a ‘pass’ or ‘fail’. It’s about figuring out whether the ad does what it’s meant to do amongst the people you want it to reach.
No point throwing several million at an ad that doesn’t say anything about you, doesn’t really get people liking you, and doesn’t make anyone want to do anything as a result of seeing it.
It would definitely be interesting to see how the world’s most successful ads fare in such testing.
But Andrew – my question is what sort of ad do you consider successful?
Does it drive sales? Then let’s all invest in Brandpower ads and tell people how wonderful our product is so they can buy it and never think of us again.
Does it make people send it around the internet and have a giggle? Great, let’s spend a fortune to entertain people – might as well make a movie and drop the brand altogether.
Millward Brown, going by their corporate blurb, works with a large proportion of the top brands – so Andrew, I guess your comment of ‘many’ applies to less than a quarter?
User ID not verified.
darren – fire your proof reader!
out of focus – pre-testing is rarely done with focus groups. if it is, that’s probably why you’re so negative
mumbrella – how much were you paid for this advertorial?
User ID not verified.
Research is a part of life if you are creating expensive ads, especially TV.
Rather than bitch about it, I suggest creatives attend more groups to get a better feel for the consumer (when it’s qual. and they let you go that is).
I’ve saved several ideas by quoting a consumer.
User ID not verified.
Hi “Flogging”,
To answer you directly (although I’m sure you realise the answer really) – nothing. Personally, I think it’s quite an interesting topic that’s worthy of discussion. Don’t you?
Cheers,
Tim – Mumbrella
The value of testing is so overstated, it’s ridiculous really. The methods used to test would be laughed out of any respectable science department. And, at its heart, what testing does is ask a person for a rational response to what is, more often than not, a gut reaction – something emotional and subconscious. For instance, ask women if they want to see the ‘average’ woman represented in mainstream advertising and they almost always say they do. And they almost always ignore it whenever it’s tried. One rarely correlates to the other.
And yet testing is sold to clients as rock solid research. It’s simply not. At best, it’s a loose guide to what someone might think, maybe. But clients are told to take this research as gospel. Ads are changed (sometimes wholesale) accordingly. And what you end up with is a focus group re-directing creative strategy and work without any idea what they’re doing, or why. Is there another industry where such a moronic process is seen as effective?
As Andrew suggested previously, if testing was really that good no ad would fail. The real truth is that it’s extremely hard to predict actual human behaviour. Testing purports to provide insight, but it rarely does.
User ID not verified.
Pre-testing ads doesn’t kill creativity… rappers do
User ID not verified.
Frank, if they tested the actual ad I’d feel slightly, just slightly more comfortable. As Out of focus mentioned they rarely do.
Here’s a test for you: get a storyboard artist to whip up some stick figures eating cake, then ask your wife if she likes her wedding photos.
How much have Steve Jobs, Phil Knight and Richard Branson paid MB over the years…??
If you are a client step up and put your opinion on the line, don’t hide behind some pseudo wank.
How about MB start with testing their owner WPP to find out why my fucking share price is so low? It’s just another revenue stream playing on insecurities for a big ugly holding company.
User ID not verified.
I applaud Darren for saying this: “I’m the first to admit the research industry could do a better job of debunking the myths about advertising testing and I’m hoping this piece gets the conversation started.” Sincerely, I do.
[rant]
The challenge for the research companies is that new, better science is revealing that most market research methodologies are out-of-date. Flawed. Based on a 1950s understanding of memory, recall and decision-making.
Until they grasp that nettle and stop selling their findings as ‘truth’ to our clients it’s going to be very hard to get the ad community back on board. Like a battered spouse, we have seen far, far too many good ideas beaten beyond recognition by the heavy hands of ‘objective researchers’.
That said, there are some bloody marvelous qualitative folks who definitely help make the work better, because that’s their goal/focus. Not ‘truth’.
[/rant]
User ID not verified.
Sounds like a battle between those who desperately want to be heard for better or worse, and those who desperately don’t want to listen to anybody (researcher or target audience), again for better or worse.
not the best building blocks for middle ground…
That said, agree with Tom that it definitely opens the door to smaller research suppliers who have the flexibility to do things differently (with legitimacy, not just for kicks), of which there are plenty if you look…but, ummm, battered spouse?? Might’ve lost me there.
User ID not verified.
GLC- shame on you for abusing rappers.
Shame!
User ID not verified.
Some interesting comments… and I don’t disagree with all of them…. this debate has been going on since copy-testing was invented and will go on until it ends. And WPP stock is performing pretty well …
We believe that the best optimisation occurs when research outcomes are combined with a good dose of gut feel. Relying purely on research would suggest that clients and agencies don’t have valid opinions, whereas depending solely on gut runs the risk of creating ads that only those who have been involved in their development can understand.
While we see a role for groups in creative development – the group dynamic helps build ideas – by the time we get to execution testing, we advocate one-on-one interviews. Testing executions in groups doesn’t work as we tend to process ads as individuals in idiosyncratic ways. We know that attempts to recreate real-life viewing situations aren’t very predictive, so we don’t do it. Instead we ask questions that we know to be linked to real-life success. And you’d be amazed how many ‘normal’ Aussies, that is non-marketing types, like giving an opinion.
On the point of animatics – they work. We’re normally looking at structure and flow rather than executional detail. But if the ad will rely on certain shots in the finished film, agencies can cut some footage in.
Ultimately, to Frank’s point, we are looking to help create effectiveness. There are still too many ads that while beautiful pieces of production, talk to the marketing elite and fail even to be understood by a wider audience.
User ID not verified.
@Daren
“There are still too many ads that while beautiful pieces of production, talk to the marketing elite and fail even to be understood by a wider audience.”
I have to agree with you whole-heartedly there.
User ID not verified.
Daren, the WPP stock price is lower now than it was in Feb of 2007. If you are happy with that it makes sense you embrace this form of testing.
User ID not verified.
A horse designed by a committee – you should stick to…er, equestrian issues.
It’s idiotic to compare any share price today with a price in Feb 07, before the GFC or threat thereof wiped 50% off the US sharemarket.
Tony Richardson and Scott Taylor are on the money with their references to an industry that is desperate not to be measured
The ‘creative’ part of the ad industry is an anachronism that needs to be dragged into the 21st century. No method of measuring its efficacy will be foolproof, but the alternative is far worse – clients like me simply stop spending money on advertising because we’re sick of being spun to by frustrated ‘artists’, slick creative directors and media strategists who peddle their self-serving versions of amateur psychology as some form of pseudo-science.
User ID not verified.
@sven,
BHP in Feb 2007 was $26.59
Now $42.80
whilst facing the same economic conditions.
Apple, which doesn’t Link test:
Feb 2007 was $83.27
Currently $247.15
Again, after riding through the same economic storm. Pay a bit more attention and you can pay off that used Audi.
WPP has lost money over the same time while Martin Sorrell has negotiated a $96m US windfall for himself.
I agree that there are a number of creatives and agencies creating art for art’s sake in this country. The problem lies with global alignments. Agencies here can’t win or lose business so they are free to act irresponsibly with no consequences.
A creative has no financial incentive to grow a client’s business in Australia. They are compensated by winning awards. That is the root of the problem with using multinationals.
However, get an independent agency owned by the people working there and you will see a whole different level of client care and respect. One of the major reasons why Droga5, Three Drunk Monkeys and Host are now bigger than most holding company outposts.
User ID not verified.
“A creative has no financial incentive to grow a client’s business in Australia. They are compensated by winning awards. Thus the root of the problem of using multinationals.
However, get an independent agency owned by the people working there and you will see a whole different level of client care and respect. One of the major reasons why Droga5, Three Drunk Monkeys and Host are now bigger than most holding company outposts.”
This is so on the money it hurts.
User ID not verified.
“strategists who peddle their self-serving versions of amateur psychology as some form of pseudo-science”
This is hilariously on the money, too.
[I’m loving this thread… but then again, I’m a massive nerd.]
User ID not verified.
@A horse designed by a committee…
I’m sure Apple’s stock price increase was completely due to their ads (and lack of pre-testing) and nothing to do with the 2007 iPhone launch…
There is a lot of good research and a lot of bad research out there – and similarly there are a lot of good and bad practitioners of research. And then a lot of research being used by people who don’t know how to properly interpret and use the results!
We run into problems when we expect consumers to make our decisions for us. Again, as Darren says, “Relying purely on research would suggest that clients and agencies don’t have valid opinions” – we can’t ask consumers to make our decisions for us and then blame them if they fail. Pre-testing (or any research) should use consumer feedback as part of the total picture before making a decision. The consumer feedback is the “input”, not the “output”, of good research – it is our responsibility to use that input appropriately.
User ID not verified.
@Victoria I never suggested that Apple’s success (or BHP’s) was “completely” due to their advertising.
Just pointing out to Sven that not all stocks are underwater like MB’s holding company is. If he’s happy taking a loss for the last few years I guess that just makes it easier for other investors.
User ID not verified.
Daren,
Millward Brown are no more likely to predict the outcome/effectiveness of an ad than me.
As someone said in one of the earlier comments, most of the techniques [like Link testing] wouldn’t get past stage 1 of any scientific scrutiny.
Your methodologies are educated guesswork at best.
I’m more than happy to embrace market research if it’s grounded in indisputable science. If it’s just going to be a matter of opinion I’ll stick to mine.
User ID not verified.
@A horse. Much like sven didn’t say that “all” stocks were underwater…
User ID not verified.
Just a thought – based on past IPA effectiveness winners, non-pre-tested campaigns have a significant advantage as far as campaign profitability goes, as well as a higher success rate as defined by Les Binet & Peter Field in their analysis available on WARC.
Not gospel, obviously, but an interesting read.
User ID not verified.
@horse, quit while you’re ahead, you’re sounding like a horse’s a**.
you weren’t saying that all stocks aren’t underwater like WPP – you don’t like ad testing and were implying that WPP was underwater because it owns businesses involved in ad testing
this is simply nonsense, as is any discussion attempting to find a relationship between the share price movements of Apple, BHP and WPP. They are in completely different industry sectors, operating in different regions and with very different corporate stories.
Apart from Apple’s iPhone and iPad (and confirmation that Jobs isn’t dying), as you may or may not be aware, Chinese demand for Australian ore has kinda been the biggest business story for the last couple of years. Oh and there’s the residual potential for BHP and RIO to merge… or for BHP to be taken over by a China-state entity.
any WPP stock price comparison only makes sense relative to an Omnicom or similar, so why don’t you crunch the numbers and come back to us. Don’t forget to include some sell-side research or buy-side commentary which supports your contention that institutional investors hate ad-testing too.
i’m happy to come around and help you. I will probably drive my used Audi R8 if my wife is driving the RS4 wagon. I can pick you up from the station after your last uni lecture, if you like..
User ID not verified.
@tweebs. Do you have a link for that analysis? I’d love to read it (’cause I’d love it to be true). Hitting Google now…
User ID not verified.
http://store.warc.com/DisplayS.....ductID=647
Got it. 75 quid… isn’t everything on the interwebs supposed to be free?
User ID not verified.