Opinion

How to fight survey fatigue

From planning and implementation to customer experience and data assessment, Chris Breslin explains how to create a marketing survey that delivers on more than just a high open rate.

“I used to like their customer service – until they kept asking me if I liked their customer service.”

The increasing role of customer experience is threatening, ironically, to turn into a bad experience for customers, who now face a perfect storm: a flood of survey and feedback requests from everyone from their pet store to their local politician.

Over-eager, approval-seeking web forms and emails, follow-up text messages and robotic voice calls – all clamouring for your opinion.

Do you like me? Why not? What did I do wrong? What can I do to make it up to you?

Professional market researchers now have to compete for the public’s time with this myriad of surveys, some promising bonus points or entry into competitions, others just demanding 20 minutes of your time, repeatedly.

Further complicating matters is the proliferation of free web surveys that anyone with an internet connection can send out to harass an already survey-fatigued society.

To be clear, there is nothing wrong with businesses using a Voice of the Customer programme to improve the customer experience – it’s a great thing. But, there is a knock-on effect and as market researchers, we need to respond to that.

A study conducted by the Pew Research Center found the response rate of a typical telephone survey dropped from 36% in 1997 to just 9% in 2012 and was continuing to fall.

Other anecdotal evidence has the response rate for market research surveys in general dropping from 20% to just 2% in the past 20 years.

Effective management of customer experience requires an innovative approach to overcome this survey fatigue, avoid frustration and ensure the integrity of the data collected.

With relevant samples, careful survey design and appropriate incentives we can offer respondents a more tailored and attractive experience.

The Sample

Some of the organisations we speak to use their own samples while others use external samples. Surveys that you distribute internally (i.e. to employees) generally have a much higher response rate than those distributed to external audiences (i.e. customers).

  • Internal surveys will generally receive a 30-40% response rate (or more) on average
  • External surveys receive an average 10-15% response rate
  • Surveys sent by market research companies are considered external-audience surveys, with response rates in the same 10-15% range

The different motivation levels of these two audiences have a lot to do with the big swing in response rates. Below are some best practices regarding samples:

  • Invitation channel: Check with the sample provider about the invitation channel previously used with the selected sample. If the survey is intended for desktop, ask your panel provider to confirm the completion rate previously achieved on desktop devices. The same benchmark matters if you have designed a survey for mobile respondents: panel providers can confirm that the selected panellists have a high percentage of completions on mobile, or that they have completed previous surveys when the invitation channel was SMS, for example.
  • Device type: Use scripting at the beginning of the survey to detect and log the device type, and to check the respondent is using the device the survey was designed for. For example, if the survey is designed for laptop and they are on mobile, you need to ensure it is optimised for mobile (a minimal sketch of this kind of check appears after this list).
  • Recent response rates: Check with your panel provider on individuals’ response rates over the past one, three, six and 12 months. You want to know whether their behaviour shows they are actively taking part in surveys.
  • Segment respondents: As no two participants are the same, consider storing behavioural information to help segment panellists. To do this you will need to own the sample or be involved in a long-term tracker study, but it can yield useful insights if you are able to. Participants will each have their own behavioural patterns, and you can group these behaviours in different ways.
  • Motivation: People have different motivations for taking part in surveys. These can be either extrinsic or intrinsic.
    • Extrinsic: Those who are more extrinsically motivated are incentive-driven. This group typically makes up about 10-20% of respondents
    • Intrinsic: These people are motivated as information seekers, or they may be power seekers who want to influence the brand. For example, if they use a product regularly, they may want to help improve it.
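By way of illustration, here is a minimal sketch of the kind of device check mentioned above, assuming the survey platform lets you run a short script against the respondent’s user-agent string. The keyword pattern and function name are illustrative only – real user-agent parsing libraries cover far more cases.

```python
import re

# Illustrative keyword pattern only; real user-agent libraries handle many more devices.
MOBILE_PATTERN = re.compile(r"Mobile|Android|iPhone|iPad", re.IGNORECASE)

def log_device_type(user_agent: str, designed_for: str = "mobile") -> dict:
    """Classify the respondent's device from the user-agent string and flag
    a mismatch with the device the survey was designed for."""
    device = "mobile" if MOBILE_PATTERN.search(user_agent) else "desktop"
    return {
        "device": device,
        "designed_for": designed_for,
        "mismatch": device != designed_for,  # e.g. trigger a device-optimised layout
    }

# A desktop respondent opening a survey designed for mobile is flagged as a mismatch.
print(log_device_type("Mozilla/5.0 (Windows NT 10.0; Win64; x64)", designed_for="mobile"))
```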

The more you know about your audience, the better you can shape the survey and make it relevant to them – and the more likely they are to be motivated to take part.

A whole host of information relating to your surveys can be stored – specific to each participant – so you know when people are most likely to open invitations, click-through statistics, how many reminders they need and so forth. This helps you hone the survey and decide, when it goes out, what incentives to provide based on that information.
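As a concrete example, one small piece of this is working out the best send hour for each panellist from stored invitation-open timestamps. The data structure and figures below are hypothetical, purely to illustrate the idea.

```python
from collections import Counter
from datetime import datetime

# Hypothetical stored behaviour: past invitation-open timestamps per panellist.
open_history = {
    "panellist_001": ["2023-05-02T18:10", "2023-05-09T17:45", "2023-05-16T18:30"],
    "panellist_002": ["2023-05-03T08:05", "2023-05-10T07:50", "2023-05-17T20:15"],
}

def best_send_hour(timestamps: list[str]) -> int:
    """Return the hour of day at which this panellist most often opens invitations."""
    hours = Counter(datetime.fromisoformat(ts).hour for ts in timestamps)
    return hours.most_common(1)[0][0]

for panellist, opens in open_history.items():
    print(f"{panellist}: send around {best_send_hour(opens)}:00")
```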

The Survey

Once the background preparation for your survey is done, let’s look at some recommendations for the survey itself.

  • Survey invitations:
    • Time of day: Establish the time of day that works best for respondents. A popular window is roughly between 5pm and 8pm, or when they are commuting to and from work.
    • Email subject line: This is important – you don’t want participants to think it is spam. Think carefully about the wording of your titles; consider something like, “Please provide feedback about…”
    • Short and to the point: Keep the invitation short and make sure you include the purpose of the survey, its duration and the reward information.
    • Consistency: Respondents are creatures of habit. Consistency in the look and feel, and use of the same template, means respondents will get used to your invites.

  • Value respondents’ time: Make it clear that you value their time and willingness to complete the survey.
  • Survey length: Surveys are getting shorter and shorter. Some are 20-25 minutes, but many for online and mobile are between 3 and 8 minutes. The response rate is likely to decrease if a survey takes more than 10 minutes. So keeping surveys shorter will give you more chance of success. Consider making mobile your common denominator when planning the length of the survey.
  • Survey design: Put yourself in your respondents’ shoes. Think about the layout. How can you make it more enjoyable for them? Wherever possible, use piping of previous answers to make the survey more engaging.
  • Pre-loading of data: Not all of your questions may apply to all your respondents. For example, you may be surveying doctors but know that not all drugs are relevant to all of them. You can preload the information provided in the previous wave and ask the doctor to validate it, updating it only if anything has changed. This can help reduce the time it takes to complete the survey and keep them engaged.
  • Avoid certain types of questions: There are some questions that are better to avoid, as they can lead to drop-outs and lower data quality. Examples that can have a higher drop-out rate are:
    • Multi-grid questions: A large number of brands with a large number of statements can lead to a higher drop-out rate. The recommendation is a maximum of eight statements with a small number of brands (five to eight, ideally). This is especially true if participants are using mobile devices, since shortening the list of statements from, say, 40 to eight greatly reduces scrolling.
    • 3D grids and open-text grids: Overly complicated matrix questions can render differently on mobile devices, leading to higher drop-outs and lower data quality. One approach is to split the grid across two or three pages (see the sketch after this list).
    • Too many open questions: You will gain more value if you keep these to the beginning of the survey and limit them to two or three. The idea is to capture the valuable insights from open questions while participants are still fresh.
  • Shorten questions: With social media and short, Twitter-style sentences becoming the norm, try to keep questions to 140 characters or fewer where possible. This forces a move away from academic language towards easier-to-understand wording. Ask the kinds of questions you would ask if you were in front of the person, rather than longer, complicated ones.
  • Dive directly into the survey: Eliminate warm-up questions and check that all the questions are linked to a business metric. Adding 2 or 3 filler questions just makes the survey longer.
  • Use multimedia and gamification: Make questions as engaging as possible. Use audio and video questions and try using video or audio to replace open text answers. You can also consider audio to read out the questions or brand names. This is a good option if you are doing product name testing.
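To make the grid-splitting point concrete, here is a minimal sketch of paginating a long list of statements. The eight-per-page figure follows the guideline above; the function name is illustrative.

```python
def split_grid(statements: list[str], per_page: int = 8) -> list[list[str]]:
    """Split a long list of grid statements into smaller pages so that mobile
    respondents see at most `per_page` rows per screen."""
    return [statements[i:i + per_page] for i in range(0, len(statements), per_page)]

# Example: a 20-statement grid becomes three shorter pages (8 + 8 + 4 rows).
statements = [f"Statement {n}" for n in range(1, 21)]
for page_number, page in enumerate(split_grid(statements), start=1):
    print(f"Page {page_number}: {len(page)} statements")
```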

Incentives

As a rule of thumb:

  • Long surveys (40 minutes) always need rewards
  • Short surveys (3-10 minutes) may not need a reward, if respondents are engaged

As such, it is important to vary incentives based on the target groups and correlate them with the length of the survey. The longer the survey, the higher the incentive – and harder-to-reach groups need higher rewards.

If you need a quick response to a survey, give higher rewards to the first, say, 200 respondents.

Incentive amounts can also be varied across the sample. Certain target groups are harder to engage and need a higher reward; if a sample is made up of a combination of target groups, consider increasing the incentive for the harder-to-reach group.

A variety of incentive options seems to work. For example, younger people may be more interested in Spotify or iTunes vouchers, while older respondents may prefer lotto tickets or competitions.

Please note that, depending on the study, there may be legislation and codes of conduct that fix the maximum incentive that can be offered.
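Taken together, these rules of thumb lend themselves to a simple calculation. The sketch below is purely illustrative – the amounts, the multiplier for hard-to-reach groups and the cap are placeholder assumptions, not recommended values.

```python
def suggest_incentive(duration_minutes: int,
                      hard_to_reach: bool = False,
                      early_responder: bool = False,
                      legal_cap: float = 50.0) -> float:
    """Rough incentive suggestion: scale with survey length, bump the amount for
    hard-to-reach groups and early responders, and respect any legal cap.
    All amounts here are placeholders."""
    if duration_minutes <= 10:
        base = 0.0    # short surveys may need no reward if respondents are engaged
    elif duration_minutes <= 25:
        base = 5.0
    else:
        base = 15.0   # long surveys (e.g. 40 minutes) always need a reward
    if hard_to_reach:
        base *= 1.5   # harder-to-get groups need higher rewards
    if early_responder:
        base += 2.0   # e.g. a bonus for the first 200 completes
    return min(base, legal_cap)

# A 40-minute survey, hard-to-reach group, early responder: (15 * 1.5) + 2 = 24.5
print(suggest_incentive(40, hard_to_reach=True, early_responder=True))
```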

So can we ever create a world in which respondents are so engaged that survey fatigue is a thing of the past?

Perhaps not. But a combination of best practices can certainly reduce the impact significantly and ensure the future of the survey remains bright.

Chris Breslin is the manager, Australia New Zealand, at Confirmit
