Tag Archives: best practices

4 e-Newsletter Tips To Ensure Your Company’s Success

In this digital era where businesses are in constant communication with their audience via social media, many feel the e-newsletter is not necessary, or simply go through the motions when sending one out. By doing so, they are missing out on a great opportunity to deliver interesting information in a neat package to their customers or clients. By not being limited to 280 characters, you can expand on topics and provide your audience with an informative and thought-provoking newsletter. That is, if it’s done correctly. We’ve compiled a list of best practices to ensure you are utilizing this personal interaction with your customers or clients.


You are writing this newsletter to your current or prospective customers, and while it may be hard to refrain from selling to your audience, use the 90/10 balance. That is, make your news informative and timely, and not just about your products or services. A good newsletter is 90% educational, and 10% promotional. No matter how much someone loves your products or services, if every interaction with your brand is “buy, buy, buy”, you will have unsubscribers so fast your head will spin.


Keep it clean—the design, that is; get your mind out of the gutter. No one wants to have to work to read through clutter and crazy designs. Keep it succinct, and guide the reader through the experience. Typically, there should be an image, a few paragraphs of introduction and a clear call to action.


These days, everyone has software that allows them to measure open and click-through rates. While these are good ways to gauge the success of an e-blast, you shouldn’t focus on these solely. For example, if I receive an email with the subject line: “50% off men’s clothing”, only to find it applies to men’s socks, I will be annoyed and frustrated. While that bait-and-switch tactic resulted in an open, you may have just lost a current or potential customer.


While customer communications can seem more casual today than in the past, grammar, punctuation and spelling all still matter. There’s nothing worse than receiving an email, only to notice there are spelling errors or other simple mistakes. Make sure all of your links work and direct the reader where you want them to go. It may sound obvious, but these are the small things that often slip through the cracks. You want every interaction with your customer to show your company in the best light, and this is an easy one to execute. Get another set of eyes on it, send test emails, and fine-tune until you feel confident it would receive an A from your high school English teacher.

QuestionPro Audience provides our clients with access to more than 22 million active respondents, who are strategically recruited to participate in quantitative research and live discussions. By implementing various recruitment methodologies, we make sure to provide the right kinds of respondents for your research. With industry knowledge and innovative tools, QuestionPro Audience always meets the rigorous demands of our clients. Contact us for your next research project.

Best Practices of Questionnaire Design

Conducting research studies is one thing, but conducting a survey that actually generates insightful data is another. A well-written survey can provide the client with invaluable customer insights. While survey methodology may seem intimidating to those who are unfamiliar with market research, it doesn’t have to be. QuestionPro Audience has compiled a list of 5 best practices to ensure you collect the best data from your research study.

SPEAK YOUR RESPONDENT’S LANGUAGE Something that may seem clear to you may be confusing to your respondents. Use simple language and terminology that your respondents will understand, and avoid technical jargon and industry-specific language that might confuse or frustrate your respondents.
Replace: Which of these stores is closest in proximity to your residence?
With: Which of these stores is closest to your home?

AVOID ASKING MULTIPLE QUESTIONS IN ONE It may seem like common sense, but it’s an easy trap to fall into when compiling questions. If any of your questions contain the word “and”, take another look at it. This question likely has two parts, which can taint your data quality.
Replace: Which of these brands has the best quality and taste?
With: Which of these brands is your favorite?

ASK EASILY MEASURABLE QUESTIONS The key is to ask closed-ended questions, which generate data that is easy to analyze and trend; closed-ended questions are also easier on the survey taker. Open-ended questions require more time and effort from the respondent to produce a concise, quality response.
Replace: What do you like about our company?
With: What service(s) of ours do you use?
a) internet b) cell phone c) cable TV

AVOID YES OR NO QUESTIONS People have a tendency to say they like things or agree, even if they don’t actually feel that way. With yes/no questions, you’re not able to capture people who are on the fence or gather any further insight than whether they like something or not. Most yes/no questions can be reworked by including phrases such as how often, how much or how likely.
Replace: Do you like Chicago’s public transportation system?
With: How likely are you to recommend the Chicago public transportation system to a friend?
a) extremely likely b) very likely c) neither likely nor unlikely d) very unlikely e) extremely unlikely

SAVE PERSONAL QUESTIONS FOR LAST Sensitive questions may cause respondents to drop off before completing. If these questions are at the end, the respondent has had time to become more comfortable with the interview and is more likely to answer personal or demographic questions.

Are you ready for your next survey project? We can help! Contact us at sales@questionpro.com.

QuestionPro Audience is the leader in online and offline data collection with access to millions of pre-qualified respondents who participate in thousands of surveys daily. We provide our clients the necessary tools and expertise to conduct 360 degree survey solutions.

What is Response Bias in Online Studies? (and how to slay it)


You’ve got it all ready: a vetted panel, a crafted questionnaire, and even an enterprise software platform from a trusted provider. You’re ready for the data that will propel your market research to blissful success. Everything should be fine, right?

Wrong. A lot can go wrong.

One is response bias, or respondent bias. It’s a major issue in any survey methodology.

The other issue is respondent fatigue, which we address fully in our article Empathy for the Devil that is Writing Survey Questionnaires, along with how to remove this other potential threat to your data collection methods.

But back to the first issue. What exactly is response bias?

In Personality and Individual Differences, Adrian Furnham states:

Response bias is a general term for a wide range of cognitive biases that influence the responses of participants away from an accurate or truthful response. These biases are most prevalent in the types of studies and research that involve participant self-report, such as structured interviews or surveys.

The Encyclopedia of Survey Research Methods further explains:

Response bias is a general term that refers to conditions or factors that take place during the process of responding to surveys, affecting the way responses are provided. Such circumstances lead to a nonrandom deviation of the answers from their true value. Because this deviation takes on average the same direction among respondents, it creates a systematic error of the measure, or bias. The effect is analogous to that of collecting height data with a ruler that consistently adds (or subtracts) an inch to the observed units. The final outcome is an overestimation (or underestimation) of the true population parameter.

In brief, response bias is the reality that participants bring a lot of baggage to surveys (and attempt to hide a lot of this baggage by playing it safe alongside the proverbial pack). Furthermore, respondents have an innate desire to please the studies they’re participating in, and therefore tend to answer questions the way the researcher might want instead of answering honestly.

Even briefer, humans are humans: as complex in public as they are in private.

Yet there are research techniques that can decrease response bias. Some of these procedures paradoxically mean being more human and less scientific!


Don’t Lead Your Respondent



In the legal arena, it’s akin to the prosecutor’s loaded question: “Where were you on the night you murdered your wife?”

Researchers frequently do the same, often subconsciously. It’s also referred to as inherent bias. As one researcher wrote:

For example, a satisfaction survey may ask the respondent to indicate where she is satisfied, dissatisfied, or very dissatisfied. By giving the respondent one response option to express satisfaction and two response options to express dissatisfaction, this survey question is biased toward getting a dissatisfied response.

The answer to this is to ensure that your questions are balanced. In addition, have colleagues review every questionnaire to ensure no personal partiality contaminates the study.
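As a hypothetical illustration (the option labels and sentiment tags below are our own assumptions, not part of any survey platform’s API), a quick programmatic check can flag an unbalanced scale like the one in the quote above before the questionnaire ships:

```python
# Sketch: flag unbalanced rating scales before a questionnaire goes out.
# Each option is hand-tagged as "positive", "negative", or "neutral";
# a balanced scale offers as many positive options as negative ones.

def is_balanced(options):
    """Return True if the scale has equal positive and negative options."""
    positives = sum(1 for _, sentiment in options if sentiment == "positive")
    negatives = sum(1 for _, sentiment in options if sentiment == "negative")
    return positives == negatives

# The biased scale from the example: one way to be satisfied,
# two ways to be dissatisfied.
biased_scale = [
    ("satisfied", "positive"),
    ("dissatisfied", "negative"),
    ("very dissatisfied", "negative"),
]

balanced_scale = [
    ("very satisfied", "positive"),
    ("satisfied", "positive"),
    ("neither satisfied nor dissatisfied", "neutral"),
    ("dissatisfied", "negative"),
    ("very dissatisfied", "negative"),
]

print(is_balanced(biased_scale))    # False: leans toward dissatisfaction
print(is_balanced(balanced_scale))  # True
```

A colleague’s review catches wording problems a script never will, but a mechanical balance check is a cheap first pass over a long questionnaire.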


Give Them a Way Out



As mentioned, respondents have a natural desire to assist the studies they’re involved in. They also naturally will put pressure on themselves to offer the best possible answers. This pressure may (naturally AND mathematically) create skewed answers. Alleviating this pressure will ensure more honest responses.

An opt-out choice is an effective way to relieve this pressure. As one authority on surveys explained:

That is why it is imperative that every question has an opt-out choice. This is usually in the form of a “Don’t Know,” “Not Sure” or “Undecided.” Not only will adding the opt-out choice eliminate a lot of inaccurate answers from your study, but it will also provide you with valuable information.

The same source states that a notion called “social desirability” is potentially present in surveys—that is, the reluctance of respondents to answer sensitive questions due to an intrinsic fear of being exposed to society. The solution is to stress the anonymity of the study beforehand, and to reinforce it throughout the survey with on-screen text before a sensitive section, for example.


Offer Questions in a Dynamic Manner



When respondents are presented with a steady pattern of inquiry, they typically answer based on the previous question or subject arrangement. (This can likewise cause respondent fatigue.)

Researcher Sam McFarland found that: “When you start with a closed question, you may affect how the respondent will answer a subsequent open-ended question on the same topic because the earlier question has primed them to focus on that issue.”

A white paper on response bias in surveys, issued by the Hebrew University of Jerusalem, offers further solutions:

Other ways of circumventing or revealing response bias could be, for example, presenting items on separate screens when using computerized versions, instead of presenting all items simultaneously on one page. The same can be done with traditional PP questionnaires, though the procedure is much more cumbersome. Moreover, the use of a computerized questionnaire enables simple manipulation of the visual presentation of both items and scales. For instance, the computerized questionnaire can present a small number of items simultaneously. Using such techniques might reduce response bias by hindering participants’ attempts to rely on answers to previous items, or on the visual pattern of their answers that is visible when using PP questionnaires. This predicted reduction in response bias is expected to result in lower measures of internal consistency for the computerized versions of questionnaire.

To wit, keep it exciting, mix it up and scatter the topics in the questionnaire.
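One way to “mix it up,” sketched here under the assumption that your survey tool lets you feed it an ordered question list (the question strings and the seeding scheme are illustrative, not any vendor’s API), is to randomize item order per respondent:

```python
import random

def randomized_order(questions, seed=None):
    """Return a shuffled copy of the question list for one respondent,
    so no two respondents see the same steady pattern of inquiry.
    Seeding by respondent ID makes each ordering reproducible."""
    rng = random.Random(seed)
    shuffled = list(questions)  # copy; never mutate the master list
    rng.shuffle(shuffled)
    return shuffled

master = [
    "Q1: brand awareness",
    "Q2: purchase frequency",
    "Q3: price sensitivity",
    "Q4: channel preference",
]

# Each respondent gets their own ordering of the same questions.
for respondent_id in range(3):
    print(respondent_id, randomized_order(master, seed=respondent_id))
```

In practice you would keep sensitive or demographic items pinned to the end (per the earlier best practice) and shuffle only within the remaining blocks.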

Most experts on survey procedures, including most of the ones quoted in this piece, agree that these additional techniques can go a long way in mitigating response bias:

–  Use clear and simple language in market research questions.
–  Do not use loaded/lightning rod terms or words unless necessary (e.g.: environmentalist, terrorist, politician, etc.).
–  Avoid negatives like “not,” or at least highlight them so the respondent understands their context.
–  Be as transparent and communicative as possible with the panel throughout the process.

With all of this in mind, response bias can be reduced from a full-blown malady to a minor nuisance. Just as positive, humans can be humans while your research data becomes divine for market research.