7 Hellish Ways You’re Burning Your Online Survey Respondents


It is said the road to hell is paved with good intentions. In market research, I would say the road to hell is paved with mediocre online surveys. Sure, researchers load their surveys with good intentions, but many of those surveys quickly end up in an inferno of faulty data.

I see this often enough, and much of it comes down to recurring mistakes that keep respondents from reaching a heavenly state of mind, whether the survey runs on enterprise software or through a contracted third-party provider. No amount of technology can make up for poor presentation and execution, and those essentials fall squarely on the halos of researchers.

Let’s take a look at these mistakes with online surveys and how they block that stairway to market research heaven.

 

1.   Not Making the Survey Convenient

 

By convenient, we mean offering online surveys on all platforms:

Desktop/laptop
Mobile
Offline (optional but it really depends on the audience)

We’ve explained in depth the advantages of mobile research, including how smartphones and tablets are surpassing desktops and laptops in time spent online as well as in searches. In any event, the key is to test your survey on all platforms, because a survey that works on a laptop won’t necessarily work on a smartphone.

 

2.   Not Keeping the Survey Short

 

As an article in Research Access suggests:

Write your survey. Cut it in half. Then cut it in half again. Then maybe, just maybe, you can add a few questions back in.

We would add this: And then ask yourself whether you would want to take your own survey on any given one of your busy days.

If you don’t believe in a painfully short survey (naturally clinging to the idea of getting maximum bang for your research buck), take a look at this breakdown chart from research revealing the risks of longer surveys:

Study on online survey respondent fatigue
On top of that, a study found that respondent fatigue sets in at around 20 minutes, which tends to dilute data. On average, each question on a survey takes about 20 seconds to complete, and ten minutes is a reasonable total length (especially on mobile devices because people are, well…mobile). So try to keep your surveys under 30 questions; at 20 seconds apiece, 30 questions lands right at that ten-minute mark.
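If it helps to keep yourself honest during questionnaire design, a back-of-the-envelope check like the sketch below works fine. (The 20-seconds-per-question figure is the rule of thumb cited above; the function names and the ten-minute ceiling are just illustrative.)

```typescript
// Back-of-the-envelope survey length check, using the rule-of-thumb figures above:
// roughly 20 seconds per question, with about 10 minutes as a reasonable ceiling.
const SECONDS_PER_QUESTION = 20;
const TARGET_MINUTES = 10;

function estimatedMinutes(questionCount: number): number {
  return (questionCount * SECONDS_PER_QUESTION) / 60;
}

function isTooLong(questionCount: number): boolean {
  return estimatedMinutes(questionCount) > TARGET_MINUTES;
}

console.log(estimatedMinutes(30)); // 10 -- right at the limit
console.log(isTooLong(45));        // true -- time to cut it in half again
```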

If you simply must have a long survey, our research reveals a potential way around this: place the most relevant questions at the beginning. Fair warning, though, as Van Halen sang…

 

3.   Not Setting a Time Limit

 

The Research Access piece further states that built-in timers raise the focus of respondents during a task. It explains:

Think about it, if you are trying to finish the survey within 10 minutes, you are concentrating and working hard. You don’t want to drop out because you are in a race with yourself and you want to see if you can meet the challenge.

Here’s a common example:

Picture of an online survey timer bar
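If your survey platform doesn’t ship a timer bar out of the box, the mechanics behind one are simple enough to sketch. The snippet below is only an illustration, not any particular vendor’s feature: the element IDs and the ten-minute limit are made up, and all it does is count down and fill a bar as time elapses.

```typescript
// Minimal countdown timer for a web survey page (illustrative only).
// Assumes the page has: <div id="timer-bar"></div> and <span id="timer-label"></span>
const LIMIT_SECONDS = 10 * 60; // e.g. a 10-minute survey

let remaining = LIMIT_SECONDS;

const intervalId = setInterval(() => {
  remaining -= 1;

  const bar = document.getElementById("timer-bar");
  const label = document.getElementById("timer-label");
  if (bar && label) {
    // Fill the bar in proportion to elapsed time.
    const elapsedPct = ((LIMIT_SECONDS - remaining) / LIMIT_SECONDS) * 100;
    bar.style.width = `${elapsedPct}%`;
    label.textContent = `${Math.floor(remaining / 60)}:${String(remaining % 60).padStart(2, "0")} left`;
  }

  if (remaining <= 0) {
    clearInterval(intervalId);
    // What happens at zero is up to you; a gentle nudge usually beats a hard cutoff.
  }
}, 1000);
```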
 
 
 
 

4.   Not Providing a Progress Bar

 
 
This could arguably be called point 3.5, but it’s an important enough issue to address and expand on in its own right.

Anyway, those of us who have taken surveys know that a progress bar is often a vision of paradise, or at the very least comforting as it measures our progression. Progress bars make respondents feel active and goal-oriented in the study. Respondents also tend to view the researchers as more transparent and honest.

Much research, including a University of Toronto study, indicates the benefits of a progress bar:

Increased survey satisfaction.
Increased respondent engagement.
An expressed desire among respondents (75% more) for some indication of progress during surveys.

In case you’re wondering:

online survey progress bar

Ahh, bet that made you feel good!
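There isn’t much under the hood of a progress bar, either. Here’s a minimal sketch, assuming one question per step and a couple of hypothetical page elements; it just maps the current question to a percentage and writes it to the bar.

```typescript
// Simple progress indicator for a multi-page survey (illustrative only).
// Assumes the page has: <div id="progress-bar"></div> and <span id="progress-label"></span>
function updateProgress(currentQuestion: number, totalQuestions: number): void {
  const pct = Math.round((currentQuestion / totalQuestions) * 100);

  const bar = document.getElementById("progress-bar");
  const label = document.getElementById("progress-label");
  if (bar && label) {
    bar.style.width = `${pct}%`;
    label.textContent = `${pct}% complete`;
  }
}

// Call this each time the respondent advances a page:
updateProgress(12, 30); // renders "40% complete"
```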

 

5.   Not Allowing Respondents to Save and Continue Later

 

Your respondent is halfway through your questionnaire when the phone or doorbell rings, with the NSA or the ghost of Ed McMahon on the other side. He or she is going to be busy, and your survey either dies or gets a chance to wait.


Ed, I predict you ruined that market researcher’s career when you rang the doorbell

The choice is yours, and this should be pretty common sense, but you’d be surprised how often surveys can’t be resumed, whether because of limitations in third-party software or a simple oversight during programming.

Marketer Ivana Taylor at QuestionPro adds some technical advice on this feature:

There are a few things to keep in mind if you plan to use the Save and Continue option when creating surveys. First, a page break is necessary for Save and Continue, meaning the online survey must be more than one page. Second, the branching and skip logic tools only work during active sessions. As such, they will not function with the Save and Continue option. Finally, randomization logic may not work properly with the Save and Continue option.

For the record, QuestionPro is our sister company, but there was no exchange of souls for this information. It’s just good information!
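Most platforms handle Save and Continue on the server, tied to a respondent ID, but the idea itself is simple. Here’s a bare-bones client-side sketch, with an invented storage key and answer format, that stashes the partial responses and restores them when the respondent comes back.

```typescript
// Bare-bones "save and continue" using the browser's localStorage (illustrative only).
// A real survey platform would persist this server-side against a respondent ID.
const STORAGE_KEY = "survey-draft"; // hypothetical key

interface SurveyDraft {
  lastPage: number;
  answers: Record<string, string>; // questionId -> answer
}

function saveDraft(draft: SurveyDraft): void {
  localStorage.setItem(STORAGE_KEY, JSON.stringify(draft));
}

function loadDraft(): SurveyDraft | null {
  const raw = localStorage.getItem(STORAGE_KEY);
  return raw ? (JSON.parse(raw) as SurveyDraft) : null;
}

// When Ed McMahon rings the doorbell:
saveDraft({ lastPage: 3, answers: { q1: "Daily", q2: "Somewhat satisfied" } });

// When the respondent returns, pick up where they left off:
const draft = loadDraft();
if (draft) {
  console.log(`Resuming on page ${draft.lastPage}`);
}
```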

 

6.   Not Keeping Your Questions Simple

 

Having a short survey does not mean you can get away with byzantine questions, even if they are few. I personally understand that we all have clients or superiors we want to please, or feel we need to get into the corners of every respondent’s cranium. But simple is just better with online surveys.

An example is found in the noted Harvard Business Review article “The One Number You Need to Grow.” It documents that the vast majority of customer satisfaction insight comes from answers to one simple question:

How likely is it that you would recommend [X] to a friend or colleague?

In 13 of 14 case studies, this one simple question was as strong a forecaster of customer loyalty as any other, more complex question.
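That question is the basis of the Net Promoter Score, and rolling up the 0–10 answers is straightforward. The sketch below uses made-up responses; the promoter and detractor cutoffs (9–10 and 0–6) are the standard NPS convention rather than anything specific to your survey tool.

```typescript
// Standard Net Promoter Score roll-up: % promoters (9-10) minus % detractors (0-6).
function netPromoterScore(responses: number[]): number {
  const promoters = responses.filter((r) => r >= 9).length;
  const detractors = responses.filter((r) => r <= 6).length;
  return Math.round(((promoters - detractors) / responses.length) * 100);
}

// Hypothetical answers to "How likely is it that you would recommend [X]...?"
const answers = [10, 9, 9, 8, 7, 6, 10, 3, 9, 8];
console.log(netPromoterScore(answers)); // 5 promoters, 2 detractors out of 10 -> NPS of 30
```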

Other than that, don’t assume the respondent knows the jargon of your industry.

I don’t have a graphic, so here is some shameless shilling of qSample:
 

advertisement
 
 
 
 

7.   Not Avoiding Yes/No Questions

 
 
Avoiding yes/no questions is an old test-writing technique from my days as a teacher: yes/no questions create ambiguity between academic and student. This is widely understood in education, but often enough it’s not applied to the world of surveys. As one researcher explains:

Yes/no questions don’t capture people who are on the fence or nuances of people’s opinions – in other words, yes/no questions can’t give you the information you need!

As another researcher states, there is a scientific reason for avoiding yes/no questions:

Decades of research in survey methodology and psychology have shown that people generally tend to avoid saying “no.” In the survey context, this is called acquiescence response bias and it is a serious threat to data quality.

Do you understand this? Yes or n…never mind…see picture…

yes no questions graphic
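If it helps to see the difference concretely, here is a small before-and-after with invented wording and scale labels: the same question posed as yes/no, then recast as a scaled item so the fence-sitters have somewhere to land.

```typescript
// Illustrative only: recasting a yes/no item as a scaled item.
interface SurveyQuestion {
  text: string;
  options: string[];
}

// The yes/no version leaves no room for nuance (and invites acquiescence).
const yesNoVersion: SurveyQuestion = {
  text: "Are you satisfied with our mobile app?",
  options: ["Yes", "No"],
};

// The scaled version captures the fence-sitters.
const scaledVersion: SurveyQuestion = {
  text: "How satisfied are you with our mobile app?",
  options: [
    "Very dissatisfied",
    "Somewhat dissatisfied",
    "Neither satisfied nor dissatisfied",
    "Somewhat satisfied",
    "Very satisfied",
  ],
};

console.log(yesNoVersion.options.length, scaledVersion.options.length); // 2 vs. 5 answer choices
```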
 
 
 
 
 
 

Conclusion

 
 
 
As a bonus, and one certainly agreed upon by the rest of the research staff at qSample, I would avoid matrix questions if possible. Also, make sure to test and edit your survey questionnaire until something close to hell freezing over. If you have any issues with progress bars or Save and Continue capabilities, contact your survey software company or online survey provider. If they can’t address these issues, then something is burning and you need to find new clouds.

With all of this in mind, your good intentions will actually be rewarded with good data for your survey research. It’s paradise by the dashboard light, as Meat Loaf sang.

 
