Should we trust consumer research?
In this guest post, James Wright questions whether marketers buy consumer research for the right reasons.
Eighty-five per cent of all research is made up on the spot, or so the old joke goes. Pick up any newspaper or visit any news website and you will find some sort of frothy research insight into attitudes and behaviour. Recently there have been a number of investigations into how accurate consumer research really is. ABC’s Media Watch recently devoted a program to the behaviour of McCrindle Research, and the most recent piece appeared yesterday in the Sunday Telegraph, looking at research company Canstar Blue. The underlying question: can we trust research?
Canstar Blue provides research ‘by consumers for consumers’. So you would imagine that the consumer is its most important stakeholder, and you would further believe that, by positioning itself in this way as the authority ‘for consumers’, its research can be completely trusted. So can it?
Let’s look at what Canstar Blue has allegedly been pulled up on. The article focuses on a piece of research into coffee shop chains in Australia in which McDonald’s McCafe came out on top for ‘customer satisfaction’. Judging by the comments on blogs, this came as something of a surprise to many, which in part led to the Sunday Telegraph investigation. It is worth noting that only three months earlier McDonald’s had apologised for the quality of its coffee. Canstar Blue says it undertook the research independently and then sold it to McDonald’s. You might think this raises a number of ethical questions, but it isn’t unusual; many research companies do the same. And Maccas really did love it: a media release was issued and the brand proudly showed off its top rating in its TV adverts. But when you dig into the research, some far-reaching questions arise.
The first is that the claim appears to be that 2,500 people were surveyed, yet it turns out only 1,700 qualified to actually answer the survey. Why did Canstar Blue not make this far clearer? After all, 1,700 people is more than enough to provide a strong result. A further point is that the research doesn’t appear to compare apples with apples. Asking customers of McDonald’s about ‘satisfaction’ is very different from asking the same question of customers at the premium end of the coffee chain spectrum. It is like asking a customer of Big W how satisfied they are and then asking a customer of David Jones: the stores are very different, and therefore their customers’ expectations are very different. In the automotive industry you would only ever compare your car to another in its class.
However, the biggest issue is that Canstar Blue wouldn’t share its data. If Canstar Blue is 100% confident in the methodology and raw data, surely it would happily provide the full breakdown and stand behind it. The results are already being promoted, so what is there to be sensitive about? By withholding the data, it looks like there is something to hide.
That leads to the question: what might be wrong with the data? If you look at how the research was undertaken, respondents only had to have visited one of the listed coffee shop chains in the past six months (and remember, only three months earlier McDonald’s had admitted that its coffee wasn’t up to scratch). So what if only 100 of the 1,700 pool had visited Hudsons or The Coffee Club, would that be representative? Canstar Blue won’t release the numbers, so we don’t know, but if the numbers were as low as this then surely they can’t be representative of chains that have thousands of people through their doors every day. Worryingly, Canstar Blue admits that a coffee shop chain only had to receive 30 responses to qualify for the research.
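To illustrate why a 30-response floor is so shaky, here is a rough back-of-the-envelope sketch (my own illustration, not Canstar Blue’s methodology) of the worst-case 95% margin of error on a satisfaction proportion at different sample sizes:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion p estimated
    from n responses (worst case is p = 0.5)."""
    return z * math.sqrt(p * (1 - p) / n)

# Compare a chain that scraped in with 30 responses against the full pool
for n in (30, 100, 1700):
    print(f"n={n}: +/- {margin_of_error(n):.1%}")
```

At 30 responses a chain’s score could swing by almost 18 percentage points either way, which makes a ranking against other chains close to meaningless; at 1,700 it tightens to roughly 2.4 points.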
Marketers have always spun research to their own advantage, and political parties are particularly guilty of it. How many times have we seen a piece of health or economic research argued as a success or a failure from two different standpoints? The fundamental difference is that we are given access to the raw data, so both sides know that if they dare argue a point that can’t be justified, they will be found out.
The Sunday Telegraph investigation is quite revealing and raises issues that shouldn’t just be applied to Canstar Blue, but more broadly to consumer research companies. But let’s stop for a moment and ask ourselves as marketers the question: when we commission a piece of research are we primarily trying to find out something revealing about a consumer or are we primarily interested in creating a marketing story?
If we are honest it is mostly the latter.
The media also has to shoulder some of the blame. Journalists love surveys; if they didn’t, we wouldn’t bother trying to get them published. So we can all take a little responsibility for creating an environment where these types of investigations are undertaken. In fact, one has probably been a long time coming.
James Wright is the general manager of Red Agency.
Proper research is great. You know, the stuff you usually do yourself. Market research is a scam. And lazy. Especially concept testing, which is an absolute farce.
User ID not verified.
Hi James
Good piece, and you are right in stating that the media (and their readers) love this. They do love surveys and will quite happily eat up anything that will give them a one-in-10-Australians headline.
As long as the research is done properly and you are honest about the results, I really don’t see any problem with it.
I do use surveys to create a newsworthy angle in some of the areas my clients work in. I deal mainly in raising awareness of health issues; these issues stay the same every year, but the need to raise awareness of them doesn’t go away. These surveys, love them or hate them, are a good way of creating a new angle to get past the news desk.
We always make sure that our surveys are transparent. While we will issue a media release and key statistics, we will also provide copies of the raw data to interested journalists so they can verify what we have given them, or use the data to find their own take on the research.
User ID not verified.
Without research, Todd “they discovered” Sampson would have nothing to contribute on Gruen.
User ID not verified.
At worst someone will buy a coffee at Maccas, once.
Otherwise it’s a meaningless claim and therefore not incorrect to publish, regardless of the survey methodology.
User ID not verified.
Well written James.
But I was wondering whether the motivation for such an article might have been driven by your / Red Agency’s client, The Coffee Club? CC founder and director John Lazarou was quoted in the story “slamming” the survey methodology.
It might be a coincidence, but it did make me curious as to why a GM at such a large agency would opine in detail on a fairly minor story.
*Please post my comment. I am merely asking if there is transparency here or an agenda, I am not accusing James of any wrongdoing.
Regards,
MK
User ID not verified.
Great article. As a member of the market research industry, I find it unfortunate that not all market research follows the guidelines and principles set out by the Australian Market and Social Research Society (AMSRS).
I fear that the “democratisation” of market research (i.e. as barriers to usage lower thanks to Survey Monkey etc.) will only make things more difficult to manage.
As for MR being “a scam”, 2Gs, as I’m fond of quoting: clients get the research (and the agencies) they deserve… just sayin’!
User ID not verified.
The Canstar Blue awards are not market research. There is no data; no transparency. They slap surveys together, award big name brands and then go after them hard for money to use the Canstar award logo. Canstar gets money and, because of the participating brands, they get credibility. It’s laughable seeing companies use these logos in their TV commercials given how unethical and insubstantial Canstar awards really are. For the sake of consumers, it’s good to see the veil lifted on their operation.
User ID not verified.
Brand equity does not sit in the boardroom anymore. It sits in the hearts and minds of the people who buy, love or dislike brands.
As marketers we must try to understand this by using some kind of insight tools… there are some great ones and there are some sh*t ones. There are also some great insight developers and some terrible ones.
Let’s not slag off a whole industry because a few are not that good at it.
User ID not verified.
Agree with a lot of the points here (e.g. a lot of ‘research’ isn’t all it’s cracked up to be).
I do have one question for James though: I notice that you guys just released the 15th Annual World Wealth Report. Would you be willing to release, for free, all the data and methodology used to compile this research? I’m just sayin…
User ID not verified.
If a survey of 1,700 marketers was run to ask which was ‘Australia’s leading PR and marketing communications agency’, do you think Red Agency would come out on top? You can make outlandish claims without research (see the Red Agency website), so to blame the research is incorrect.
I’m sure, as is often the case, it is when third parties get hold of the research and try to mould a story to fit their needs that things start to veer off track…
I agree with @michael_ashnikov: the motives for this comment sound a bit fishy to me…
User ID not verified.
I am hoping the big agency the author of this piece works for is a travel agency, but it is more likely to be an advertising or marketing agency, some of the very organisations that have been instrumental in dragging the quality of research down.
The data used to generate exciting headlines and media stories is rarely based on substantive or high-quality research. Marketers realised some time ago the PR mileage that can be generated from interesting market research findings. Unfortunately some of the less scrupulous members of the market research industry have done the same.
This quote from the article tells me everything I need to know about how the author’s agency uses research:
‘when we commission a piece of research are we primarily trying to find out something revealing about a consumer or are we primarily interested in creating a marketing story?
If we are honest it is mostly the latter.’
Your poor unfortunate clients. This tells me everything I need to know about marketing agencies and market research.
User ID not verified.
I came across this article while trying to find out how to read my electricity meter, which, by the way, I am still trying to work out.
Canstar seemed independent, so I hoped it would show me a clear way of understanding my bills.
Canstar offered a questionnaire, one question being “How many kWh did you use on your last bill?” My bill amounted to over 3,000kWh. Canstar shows a box for you to record the kWh used, but instead of just writing the amount in the box you had to press up or down arrows to scroll to the number you want. I pressed the up arrow and it began to scroll towards my 3,000 figure. To my surprise, the counter scrolled in increments of two decimal places, but I persisted, thinking it had to speed up. After about 30 seconds I had reached 4.38, still a long way from the 3,000 I required. I did a quick calculation and worked out I would need to hold my mouse button on that up arrow for 6hrs 25min to record the number I required.
I thought that, for a research company, this questionnaire had not been researched very well; unless it was created this way on purpose, to help electricity companies frustrate people even more so they stop trying to work out how to read their bills. I then looked at Canstar’s recommendations on how to select an electricity supplier and how to read suppliers’ bills. I found their explanation to be of no assistance, but they did seem to be favouring a certain electricity company. This led me to research how reliable Canstar’s recommendations were, which in turn caused me to check out your comment. Thank you for your comment; it reinforced mine.
User ID not verified.