Tag Archives: surveys

The Looming Online Survey Crisis…and One Way to Avoid It

Woman covering her face at fear of online survey on computer

Online surveys, the final frontier. Okay, maybe they’re not, but they seem to be as endless as the stars these days—with every brand asking for feedback after a purchase or service.

That’s not surprising. Online research has become the preferred data collection method for market research. The Internet and Mobile Eras have made it easier and more cost-effective to reach customer bases.

What also might not be surprising is that online surveys may be faltering as a research method, and the solutions have to do with what we do here at qSample.

With that semi-shameless plug out of the way, the predicament of online surveys is presented in a penetrating article by David R. Wheeler in The Week.

The piece is entitled The rising revolt against customer surveys. Here are some of the startling takeaways hinting at trouble in online survey heaven:

According to one study, phone survey cooperation rates have plummeted from 43 percent in 1997 to 14 percent in 2012. Online survey responses rates are even lower, with some studies showing participation rates averaging 2 percent.

Wheeler’s research is partly supported by a recent Pew Research Center study. It states that telephone poll response rates fell from 36% in 1997 to 9% this decade. This is highlighted in this graph:

graph of telephone poll participation

So what’s the problem? At first, one might think the issue is oversaturation, as so many companies can afford online surveys. Wheeler even says:

In more recent times, as scientific methods developed, surveys became the data collection method of choice for government, academia, and businesses. Response rates peaked between 1960 and 1990, followed by a precipitous fall, partly because of the exponential rise in survey requests in the internet era. One survey firm, ForeSee, conducts about a million surveys per month. Meanwhile, Mindshare Technologies conducts 60 million surveys every year — that’s a mind-boggling 175,000 per day.

With participation rates dropping and online surveys verging on spam, only respondents at the extremes of the spectrum are engaging. If only those who truly love or deeply hate your brand are sharing their viewpoints, the survey data will inevitably be skewed.

However, oversaturation really can’t be blamed. After all, everything is saturated on the Internet. That hasn’t harmed companies with determination and astuteness.

The problem, as Wheeler sees it and we agree, is that a lack of incentives prevents online research from reaching respondents in the middle of the spectrum.

Simply put: if you don’t give people a reason to participate, they simply won’t, especially in a crowded online field. Many people these days suffer from what is known as “feedback fatigue,” where they’re weary and numb from being asked to judge a brand or service after a purchase.

Marketers rationalize that improving the brand for customers is incentive enough to take online surveys. But by the data, that’s obviously not flying anymore.

Wheeler also posits that higher management has become too enamored with machine-gun quantitative research, always trying to coax the right numbers from studies before the next board meeting or end of the quarter. Additionally, he doesn’t see the “drawing prize” incentive as particularly effective. I mean, does anyone know anyone who has won the $2000-in-products drawing for participating in a CVS online survey?

In the end, though, incentivizing is key and will surely improve market research. Studies have shown that incentives “will typically lift response rates by 10-15%.”

Furthermore, an article from the Marketing Research Association states:

Careful experimentation has shown that respondent incentives can improve a survey’s quality and efficiency. In particular, incentives can:

Improve the response rate: Incentives have been found to significantly increase the proportion of the sample that is willing to participate, ensuring a large enough sample to draw generalizations.
Improve response from hard-to-reach groups: Incentives can get the attention of individuals who would normally not participate in the survey, significantly improving the survey’s measurement quality. In particular, they can help improve representativeness in survey response from low-population rural areas — essential to the inclusion of rural voices in federal decisions.
Increase efficiency: Incentives reduce costly and time-consuming non-response follow-up, which can include dozens of call-backs and in-person visits, and may even reduce the study’s cost.

In other, more uncomplicated words: reward and have your research rewarded.

This doesn’t mean offering respondents an Aston Martin for each completed questionnaire. There are sensible rewards a company can offer. They just take some imagination. For example, Starbucks often rewards Gold Member respondents with a bonus Star that gets them closer to a complimentary drink.

Or make sure to utilize an online survey provider that maintains an incentivized sample.

At qSample, we continually recompense our panels. They are recruited voluntarily through various online channels and conferences. They aren’t asked to participate for the “good” of anything, nor will they ever have to worry about a drawing for some smartphone or Amazon gift card. Their time is valuable, and so should be the time of any of your respondents—whether you conduct research in-house or contract an online survey provider.

Time is the most valuable commodity we have. Make your respondents’ time worth it with incentives. That will be a final frontier you’ll enjoy reaching.

Contact Us

Infographic of the Week: Putting Together the Survey Puzzle

Hand with floating puzzle pieces representing market research

Market research is a $24 billion a year business. It employs more than 150,000 workers nationwide. Online research has become the preferred method for research in this industry, and online surveys are an integral component.

Needless to say, it’s crucial for an online survey to be as complete as possible before it self-actualizes as raw data for ravenous researchers. Surveys may not fall under the onion motif of Shrek, but they are more like a delicate puzzle that can provide a complete picture for various studies. A missing piece can mean incomplete information or faulty responses.

The latest qSample infographic deals with this sometimes puzzling issue. Please feel free to download it at the bottom. We hope we are supplying yet one more piece to your market research success.

Survey Puzzle


Download this infographic.

Embed Our Infographic On Your Site!

Infographic of the Week: The Pros and Cons of Online Surveys

Keyboard with word survey in place of return

Online research has become the chief avenue for marketers in their eternal quest to understand the consumer or client. The cost-saving capabilities of the internet no longer mean sacrificing data in an era of more efficient algorithms and encompassing enterprise software.

But no algorithm is perfect, least of all the enterprise software that is the human mind. Thus, qSample’s latest infographic deals with some of the perils and opportunities of online surveys. It’s based on our article The Pro’s and Con’s of Surveys.

This is as much about transparency as it is about overcoming issues with online research. We hope you find the infographic useful.

The Pros and Cons of Online Surveys


Download this infographic.

Embed Our Infographic On Your Site!

7 Hellish Ways You’re Burning Your Online Survey Respondents

Hell background with surveys burning in the sky

It is said the road to hell is paved with good intentions. In market research, I would say the road to hell is paved with mediocre online surveys. Sure, all surveys from researchers are loaded with good intentions, but many quickly end up in an inferno of faulty data.

I see this often enough, and much of it has to do with recurring mistakes that keep respondents away from a heavenly state of mind—whether the research happens with enterprise software or through third-party survey providers. No amount of technology can make up for poor presentation and execution practices, which fall squarely on the halos of researchers.

Let’s take a look at these mistakes with online surveys and how they block that stairway to market research heaven.


1.   Not Making the Survey Convenient


By convenient, we mean offer online surveys on all platforms:

Desktop
Mobile (smartphone)
Tablet
Offline (optional, but it really depends on the audience)

We’ve explained in depth the advantages of mobile research, including how smartphones and tablets are surpassing desktops and laptops when it comes to time spent online as well as searches. In any event, the key is to test your survey on all platforms, because a survey being functional on a laptop doesn’t mean it will be functional on a smartphone.


2.   Not Keeping the Survey Short


As an article in Research Access suggests:

Write your survey. Cut it in half. Then cut it in half again. Then maybe, just maybe, you can add a few questions back in.

We would add this: and then ask yourself whether you would want to take your own survey on any of your busy days.

If you don’t believe in a painfully short survey—naturally clinging to the idea of getting bang for your research buck—take a look at this breakdown chart from research revealing the risks of longer surveys:

Study on online survey respondent fatigue
Lastly, a study found that respondent fatigue sets in at around 20 minutes, which tends to dilute data. On average, each survey question takes 20 seconds to complete, with ten minutes a reasonable total time (especially on mobile devices because people are, well…mobile). So try to keep your survey under 30 questions.

If you just need to have a long survey, our research reveals a potential way around this: place the most relevant questions at the beginning. Fair warning, though, as Van Halen sang…


3.    Not Setting a Time Limit


The Research Access piece further states that built-in timers raise the focus of respondents during a task. It explains:

Think about it, if you are trying to finish the survey within 10 minutes, you are concentrating and working hard. You don’t want to drop out because you are in a race with yourself and you want to see if you can meet the challenge.

Here’s a common example:

Picture of an online survey timer bar

4.   Not Providing a Progress Bar

This actually may be called 3.5, but it’s an important issue to address and expand upon in various ways.

Anyway, those of us who have taken surveys know that a progress bar is often a vision of paradise, or at the very least comforting as it measures our progress. Progress bars make respondents feel they are active and goal-oriented in the study. Respondents also tend to view the researchers as transparent and honest.

Much research, including a University of Toronto study, indicates the benefits of a progress bar:

Increased survey satisfaction.
Increased respondent engagement.
An actual desire (75% more) among respondents for some indication of progress during surveys.

In case you’re wondering:

online survey progress bar





Ahh, bet that made you feel good!


5.   Not Allowing Respondents to Save and Continue Later


Your respondent is halfway done with your questionnaire, and the phone or doorbell rings with the NSA or the ghost of Ed McMahon on the other side. He or she is going to be busy, and thus your survey either dies or it has a chance of waiting.


Ed, I predict you ruined that market researcher’s career when you rang the doorbell








The choice is yours, and this should be common sense, but you’d be surprised how often surveys can’t be resumed—whether because of limits in third-party software or an oversight during programming.

Marketer Ivana Taylor at QuestionPro adds some technical advice on this feature:

There are a few things to keep in mind if you plan to use the Save and Continue option when creating surveys. First, a page break is necessary for Save and Continue, meaning the online survey must be more than one page. Second, the branching and skip logic tools only work during active sessions. As such, they will not function with the Save and Continue option. Finally, randomization logic may not work properly with the Save and Continue option.

For the record, QuestionPro is our sister company, but there was no exchange of souls for this information. It’s just good information!


6.   Not Keeping Your Questions Simple


Having a short survey does not mean you can get away with byzantine questions, even if they are few. I personally understand that we all have clients or superiors we want to please, or feel we need to get into the corners of every respondent’s cranium. But simple is just better with online surveys.

An example is found in the noted Harvard Business Review article: The One Number You Need to Grow. It documents that the vast majority of customer satisfaction insight comes from answers to one simple question:

How likely is it that you would recommend [X] to a friend or colleague?

In 13 of 14 case studies, this one simple question was as strong a predictor of customer loyalty as any other, more complex question.

Other than that, don’t assume the respondent knows the jargon of your industry.

I don’t have a graphic, so here is some shameless shilling of qSample:


7.   Not Avoiding Yes/No Questions

This is an old test-writing technique from my days as a teacher. Yes/No questions create ambiguity between teacher and student. This is widely understood, but often enough it’s not applied to the world of surveys. As one researcher explains:

Yes/no questions don’t capture people who are on the fence or nuances of people’s opinions – in other words, yes/no questions can’t give you the information you need!

As another researcher states, there is a scientific reason for avoiding yes/no questions:

Decades of research in survey methodology and psychology have shown that people generally tend to avoid saying “no.” In the survey context, this is called acquiescence response bias and it is a serious threat to data quality.

Do you understand this? Yes or n…never mind…see picture…

yes no questions graphic


For a bonus, and one certainly agreed upon by the rest of the research staff at qSample, I would avoid matrix questions if possible. Also, make sure to test and edit your survey questionnaire until close to hell freezing over. If you have any issues with progress bars or Save and Continue capabilities, contact your survey software company or online survey provider. If they can’t address these issues, then something is burning and you need to find new clouds.

With all of this in mind, your good intentions will actually be rewarded with good data for your survey research. It’s paradise by the dashboard light, as Meat Loaf sang.


To Understand Market Research You Must Do Yoga

One will find CEOs, entrepreneurs, and Hollywood stars all turning to a new trend: yoga and meditation. After years of skepticism, this trend is only recently being seen as a holistic way to treat some of the world’s most common ailments. Yoga has been shown to help reduce stress, depression, anxiety, and in some cases chronic back problems. This alleged mysticism has certainly made me a better professional—and I’m not alone in this, just as yoga is not alone in being backed by medical science.

According to the Sports & Fitness Industry Association, more than 24 million U.S. adults practiced yoga in 2013, up from 17 million in 2008 (making it roughly as popular as golf). Many companies are offering classes to help their employees de-stress. In addition, many physicians are recommending it as a pain management treatment.

Yoga can even be looked at as a way to understand market research. Take a minute and get some insight on big market research topics and how one can connect the mind, body, and soul to it.

“I mean the whole thing about meditation and yoga is about connecting to the higher part of yourself, and then seeing that every living thing is connected in some way.” –Gillian Anderson


Analyzing The Body

In yoga, one must perform a series of poses. They range from as easy as lying down to as complex as trying to get the feet to touch the forehead while holding a handstand. In all poses, one must always be aware of the body. An individual must analyze his or her body in order to find the depth of a stretch or to what extent the pose can be performed. Yoga poses are deliberate and are meant to push the body to its limits.

This is akin to the main findings in big data. When conducting an analysis, one must be sure that the data files are consistent with one another. Any inconsistencies (e.g., in the numbers included) should be explained.

Analyzing data is likewise a priority in market research, as it grants the necessary answers to benefit the “body” that is a business. As one does with the body in yoga, one must analyze the data, test it to its logical limits, and decipher the results for further “workouts.”


Catching the “Trap Questions” in the Busy Mind

Meditation and yoga go hand in hand in their goal of a clear path of reasoning. In the typical yoga class, one will persistently hear the instructor coo, “Focus on your body and let everything else go.” Much like trap questions in surveys, this is done to refocus participants and keep them on task. Trap questions are safeguards in the form of unrelated questions sprinkled at certain intervals of the survey. They aim to adjust the focus of respondents or remove those who have no interest in providing usable data. Like trap questions, yoga uses meditation before and after class to help students focus their attention on their movements.

“I was beginning a journey learning more about myself and, surprisingly, more about business than I learned at one of the top ten business schools in the country and 20 years of professional experience.”

 Mark Hughes, CEO of C3 Metrics, on the benefits of yoga


Removing Respondent Bias in The Ego

In Western culture, the word ego refers to characteristics that often make someone seem intolerable in the eyes of others, such as arrogance, selfishness, or an inflated sense of self-importance. One of the goals of yoga is to be able to control and point out when the ego surfaces. When conducting surveys, removing respondent bias is important to gain useful data.

One of the most common biases in surveys (as in our ego) is social desirability. People like to present themselves in a favorable light. They will be reluctant to admit to unsavory attitudes or illegal activities in a survey. Instead, their responses may be biased toward what they believe is socially desirable. Yoga attempts to counter that by preaching that inner acceptance and control of the ego are part of the yogic process.


Eliminating Fatigue in The Body

After a day chained to a desk and dealing with outside stresses, yoga is often used to revitalize the body. An evening of yoga helps release the day’s tensions and fatigue. Just as in yoga, in market research it is paramount to eliminate respondent fatigue from the survey experience.

Getting information from participants is important. However, one cannot overload respondents with long questionnaires, long matrix question blocks, or complicated wording. In yoga, the body is always aware of its aches and pains as it stretches and relaxes.

As market research experts, we must also be aware of our participants’ aches and pains throughout surveys or risk fatigue. Yoga, according to the National Center for Complementary and Integrative Health, might improve quality of life by:

  • reducing stress
  • lowering heart rate and blood pressure
  • relieving anxiety, depression, and insomnia
  • improving overall physical fitness, strength, and flexibility

With these issues in mind, one can conclude that yoga helps eliminate fatigue, just as researchers should avoid fatiguing their participants.

There is a reason why almost 10 percent of Americans have turned to yoga. Many people like me in high-stress jobs turn to it as a method to reduce stress and to add to their overall physical fitness. There is claim after claim that yoga helps connect oneself with the world, and part of that world may involve sessions of meditative market research.

6 Respondent Supervillains That Destroy Your Survey Data

Getting the best possible survey data is a heroic endeavor for anyone performing online research. It makes for a better market and academic world. Sadly, there are those respondents who impede this adventure, spawned from the nether regions of River Sample™ and causing bad survey data. By viewing these respondents through the lenses of archetypal villains, a researcher’s heroic task is not only easier but perhaps even entertaining (even the most stoic of us researchers get blurry-eyed glancing at survey data all day).

Get your Data Superhero tights on, because here are the six Survey Supervillains you need to watch for:

1. The Speeder: The name says it all, and this Quicksilver or Flash persona blasts through surveys at rates that defy logic (or more like enforce logic, since they’re obviously not paying attention to questions). The Speeder is easily detected because he or she finishes surveys in less than 30%-50% of the median time. For a more technical way to defeat the Speeder, an article on Research Access recommends:

Check the median time to completion and establish rules that you feel comfortable with – I often flag those taking <1/3 of median time with a “1” (“speeder”), and those taking < 1/4 of the median time with a “2” (“super speeder”).  You might consider removing outliers (at the slow end) before calculating your median.

Yes, there is a Super Speeder too! No superhero saga is complete without reboots and higher-level bosses.
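For readers who clean data with a script rather than by hand, the quoted rule of thumb can be sketched in a few lines of Python. This is a minimal illustration; the function name and data layout are our own inventions, with only the 1/3 and 1/4 thresholds taken from the quote:

```python
from statistics import median

def flag_speeders(durations):
    """Flag respondents by completion time relative to the median.

    durations: completion times (seconds), one per respondent.
    Returns a dict mapping respondent index to a flag:
    2 = "super speeder" (< 1/4 of median), 1 = "speeder" (< 1/3 of median).
    """
    med = median(durations)
    flags = {}
    for i, t in enumerate(durations):
        if t < med / 4:
            flags[i] = 2   # super speeder
        elif t < med / 3:
            flags[i] = 1   # speeder
    return flags

# Example: completion times in seconds
times = [600, 580, 640, 150, 90, 610]
print(flag_speeders(times))  # {3: 1, 4: 2}
```

Per the quote, you may also want to drop slow outliers before computing the median.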

2. The Flat-Liner: Also known as the Straight-Liner, the Flat-Liner is the grim reaper of market research, the Thanos of survey methodologies. When reading data, this Survey Supervillain’s death-mark manifests as the same answer line chosen every time. The fiend is just not paying attention or has chosen a specific line in order to quickly complete a survey. The Flat-Liner may take the shape of the Speeder to throw you off your superhuman game, but they typically finish at an average rate. One market researcher proposes a solution to defeat (or at least nudge) the dreaded Flat-Liner:

For this reason it is important that any multi-numeric list used to measure straight-lining has questions that one would reasonably expect respondents to have varying opinions on (e.g. asking a respondent to agree with both positive and negative statements about the same product).
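To make the detection side concrete, here is a hypothetical Python sketch that measures how often a respondent straight-lines across matrix blocks. The function name and data layout are assumptions, not a standard tool:

```python
def straight_line_share(grid_answers):
    """Fraction of matrix (grid) question blocks answered identically.

    grid_answers: list of answer lists, one list per matrix block.
    A block counts as straight-lined when every row got the same value.
    """
    if not grid_answers:
        return 0.0
    flat = sum(1 for block in grid_answers if len(set(block)) == 1)
    return flat / len(grid_answers)

# One respondent's answers across three matrix blocks
respondent = [[3, 3, 3, 3, 3], [4, 2, 5, 3, 1], [2, 2, 2, 2, 2]]
straight_line_share(respondent)  # 2 of 3 blocks straight-lined -> ~0.67
```

Combined with the mixed positive/negative statements the quote recommends, a high share here is strong evidence of a Flat-Liner.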

3. The Lord of Silence: Respondents who don’t communicate are doom for survey research, but they’re easy to remove. Claptrap or sloppy answers, however, are just as dangerous yet harder to detect. That’s where the Lord of Silence operates. A tell-tale sign this Survey Supervillain is invading your research is time discrepancies—such as claiming he/she was born in 1997 but then saying they have been at their current job for 20 years.

A data research blogger further advises: “Scan through open-end responses (and other-specify responses) for gibberish or excessively vague and short answers.”
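That scan can be partially automated. The sketch below is a crude, hypothetical heuristic for flagging open-end answers that are too short or look like keyboard mashing; real gibberish detection would need more nuance:

```python
def flag_vague_open_ends(texts, min_words=3):
    """Flag open-end answers that are suspiciously short or gibberish-like.

    A crude heuristic: too few words, or no vowels at all
    (a common signature of keyboard mashing like "sdfgh").
    """
    flagged = []
    for i, text in enumerate(texts):
        too_short = len(text.split()) < min_words
        no_vowels = not any(c in "aeiouAEIOU" for c in text)
        if too_short or no_vowels:
            flagged.append(i)
    return flagged

answers = ["Great product, fast shipping", "asdfgh", "good"]
flag_vague_open_ends(answers)  # [1, 2]
```

Anything flagged still deserves a human look; a terse but genuine answer shouldn’t be banished with the Lord of Silence.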

4. Bizarro Grinch: Yes, some Survey Supervillains hurt you by leaving loot. Bizarro Grinch will leave a Christmas-tree pattern in the multiple-choice section of the questionnaire. Often, this nemesis gets cuter and leaves other “artwork” shapes in the survey. Unlike his associate, the Flat-Liner, this Survey Supervillain is harder to expose when viewing a pattern in Excel or another spreadsheet. As one researcher explains:

Christmas-Tree responses are a little more difficult to find, however, if you’ve used numeric reporting values, you should be able to easily spot ascending or descending patterns in Excel and remove these responses.

Take this mission with all of your heart, as this is worse than a pair of Old Navy socks under the tree on December 25th!
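The Excel approach works, but the same check can be scripted. Here is a hypothetical Python helper that spots the strictly ascending or descending runs of numeric reporting values the quote describes (the function name and the minimum run length are our assumptions):

```python
def is_monotone_run(values, min_len=5):
    """Detect a strictly ascending or descending run of answer codes,
    the telltale building block of a 'Christmas-tree' response pattern.

    values: numeric reporting values for consecutive questions.
    """
    if len(values) < min_len:
        return False
    diffs = [b - a for a, b in zip(values, values[1:])]
    ascending = all(d == 1 for d in diffs)
    descending = all(d == -1 for d in diffs)
    return ascending or descending

is_monotone_run([1, 2, 3, 4, 5])   # True  — suspicious staircase
is_monotone_run([3, 1, 4, 2, 5])   # False — plausible real answers
```

Sliding this check across each respondent's answer sequence surfaces the zig-zag artists without eyeballing a spreadsheet.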

5. The Duplicator: Similar responses may appear in a survey, and that could mean this Survey Supervillain is committing a heist in your survey data. He or she is the sibling of the Lord of Silence. On the other hand, doppelganger information can frequently be a case of the infamous maxim “We have met the enemy and the enemy is us.”

In other words, there is a glitch in the software replicating data, or a respondent is taking the survey a second time. Your Spidey Senses should be clear that this replica data needs to be destroyed.
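One simple way to hunt the Duplicator is to hash each respondent's full answer set and flag repeats. The sketch below is an illustrative assumption about how the data is laid out, not a prescribed method:

```python
import hashlib

def find_duplicates(responses):
    """Return indices of responses whose full answer set was seen before.

    responses: list of tuples of answers (one tuple per respondent).
    Hashing the answer tuple catches both software glitches that
    replicate rows and respondents taking the survey twice with
    identical answers.
    """
    seen = set()
    dupes = []
    for i, answers in enumerate(responses):
        digest = hashlib.sha256(repr(answers).encode()).hexdigest()
        if digest in seen:
            dupes.append(i)
        else:
            seen.add(digest)
    return dupes

data = [("A", 3, "yes"), ("B", 5, "no"), ("A", 3, "yes")]
find_duplicates(data)  # [2]
```

In practice you would also compare respondent IDs and IP or panel metadata, since two honest people can occasionally give identical short answers.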

6. The Revolutionary: Sadly, the Revolutionary is one of your own citizens, revolting because you didn’t keep the peace as the superhero of your survey neighborhood. Essentially, you didn’t write the best possible questionnaire, and thus unlocked a Zombie Apocalypse of respondent fatigue and respondent bias (Hint: both links grant the kryptonite you need to repel these uprisings, or at least provide the best survey questions and survey sampling).

As with any responsible champion of good data, your task will be to win them over again.

If you follow these steps (before they autodestruct in five seconds), you will be a Data Superhero. In reality, panel providers these days offer excellent sample. Yet as the Research Access piece stated, it’s inevitable that between “1 and 5% of survey data from panel sample is garbage.”

It’s just the way it is. Yet a simple gallant inspection and a few Superman techniques can ensure you rout these percentages rather quickly. After all, it’s widely known that 70% of online surveyors clean their data before analyzing.

Afterward, it’s always prudent to inform your panel provider of these discrepancies. Then they can make sure to minimize Survey Supervillains or glitches like Duplicators. Forward them the IDs of the respondents you banished from your study Metropolis. They’ll take action on their side to warn and/or remove these panelists from their database (as we do every week at qSample).

Furthermore, many Survey Supervillains are actually computer bots. It’s just part of the internet to encounter them. Panel providers can tweak software to eliminate the Rise of the Machines.

With that in mind, you can continue with your research to become survey savvy and a true Data Superhero. Your data results will look more like The Avengers and less like The Watchmen.


Infographic of the Week: The Advantages of Mobile Research

Online research is still the dominant data collection mode of choice. Yet as we enter the Mobile Era, mobile research is gaining favor with researchers for many purposes, beyond being suitable for segmentation or specificity projects. For many reasons we’ve detailed, mobile research should be considered as part of any qualitative or quantitative research strategy.

That is the theme of this week’s infographic. We hope you enjoy it. In addition, at the bottom of the infographic we’ve included a video of an episode of FocusVision’s Research Business Daily Report, where host Bob Lederer mentions our primary findings on…you got it…mobile research.

The Advantages of Mobile Research


Download this infographic.

Embed Our Infographic On Your Site!

New Studies Claim Survey “Trap Questions” are Questionable for Market Research

Monty Python knights screaming at floating question marks that are "trap questions"

Two new studies indicate the conventional methodology of “trap questions” for online surveys may not be as effective as originally supposed. In fact, trap questions might have unforeseen results, according to both studies, and that is a notion that tends to unnerve the methodical market research industry.

The findings came from a pair of University of Michigan studies on instructional manipulation checks, or IMCs. Both concluded that answering trap questions may alter the way people respond to subsequent questions in a survey.

For those not entirely familiar with IMCs, they are principally the same as trap questions (or sometimes called “attention checks”). In essence, not all survey respondents will pay sustained attention to questions—or even follow instructions—effectively blazing through a questionnaire. These respondents, therefore, tend to dilute survey data.

Consequently, it’s not uncommon for researchers to place safeguards in the form of unrelated questions or instructions at certain intervals of a survey. This hopes to calibrate the focus of respondents or cull those who have no interest in providing usable data.

Here is an example from a social scientist:

So, in order to demonstrate that you have read the instructions, please ignore the sports items below. Instead, simply continue reading after the options. Thank you very much.

Which of these activities do you engage in regularly? (Write down all that apply)

1)    Basketball

2)    Soccer

3)    Running

4)    Hockey

5)    Football

6)    Swimming

7)    Tennis

Did you answer the question? Then you failed the test.

Another, perhaps more approachable example, as it’s found in popular culture, would be in Monty Python and The Holy Grail, in the scene where the magical Bridgekeeper tests King Arthur and his knights with a series of questions. The right answers test the mettle of the knights, thereby allowing them to pass over the Bridge of Death and get closer to finishing their hallowed quest:

Bridgekeeper: Stop. What… is your name?

Galahad: Sir Galahad of Camelot.

Bridgekeeper: What… is your quest?

Galahad: I seek the Grail.

Bridgekeeper: What… is your favorite color?

Galahad: Blue. No, yel…

Galahad is then thrown over the bridge into the Gorge of Eternal Peril. He was a respondent attempting to blaze through the Bridgekeeper’s questionnaire.

Whether it’s Galahad or respondents sketchily answering about their sports hobbies, the two studies from the University of Michigan point out that the thinking of respondents may be modified following an IMC or trap question.

In the first study, subjects received a trap question in a math test. Half of the participants completed the trap question before the math test, whereas the other half completed the math test first. Researchers found in this study that completing a trap question first increased subjects’ analytical thinking scores on the math test.

In the second study, subjects also received the trap question in a reasoning task assessing biased thinking. As with the prior test, half of the participants finished the trap question before the reasoning task— while the other half completed the reasoning task first. The researchers discovered that completing the trap question first decreased biased thinking and caused more correct answers. Hence, completing a trap question made subjects reason more systematically about later questions.

All of this, as the lead researchers pointed out, indicates that many past studies may have been affected by IMCs. It’s suggested that deeper thinking may not always be the best state for a respondent during a survey. Instead, an optimal thinking state is where respondents are reasoning as they normally would in daily life. As more research is conducted on the efficacy of IMCs for survey research, it might be in order to focus more on other traditional safeguards such as mitigating response bias or response fatigue.

It should be noted that neither of the two studies points to any alarming suppositions of past research. In other words, these findings should not unseat market research from its continued quest over bridges of river sample to the Holy Grail of the best possible data.

Market research just has to be persistently vigilant that it and its respondents are in the right thinking to remember their favorite colors.


Infographic of the Week: The Psychology of Mobile Consumers

Lady holding up smartphone with numbers exploding from it

It’s no secret we champion mobile research for the betterment of market research (as it’s no secret mobile research is one of our specialties; we regularly partner with companies like Microsoft for smartphone surveys or focus groups). We recently even wrote how mobile technology, as a medium, transforms the very psychological architecture of respondents.

But surely some in the industry might feel the mind of the mobile consumer is still somewhat of a secret. This week’s infographic addresses this issue and hopefully offers insights on the blissful union of market research and mobile technology.

After all, as more individuals choose mobile technology over personal computers, market research migrates that way as well. As one research white paper stated:

Mobile growth in online surveys is mirroring overall growth in mobile access to the internet with survey starts on mobile and tablets rising from less than 10% of survey starts in 2011 to more than 25% of survey starts in 2014 according to 2014 Trends Report.

Please enjoy, and as always, have a wonderful and mobile end of the week.


General Contractors and Mobile Technology (qSample Study)

General contractor holding up tablet with qSample logo in it

General contractors have embraced the mobile era like the rest of the nation, although their usage of mobile technology as consumers differs from that of the general population. This and more are the findings of a recent qSample study concerning general contractors and mobile devices. The research was conducted using qSample’s general contractor panel. The survey was performed during the first week of May, with more than 200 general contractors across the country.

When it comes to mobile preference, general contractors are quite the Apple fanboys (or fangirls… though 96% of general contractors are male, according to the study). Ninety percent of respondents said they owned smartphones; of those, 58% possessed an iPhone, 36% owned Android smartphones, and only five percent named Windows phones as their preferred brand.

When it comes to tablets, 56% of general contractors owned these devices. Of those, 65% chose iPads as their brand. Once again, Android tablets came in second with 23%. Windows Surface tablets took third with seven percent, leaving Kindle Fire tablets in last place with four percent.
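Nested “of those” percentages like the ones above can be confusing, since each brand share applies only to device owners, not to the full sample. A minimal sketch, using the study’s reported figures, shows how to convert a nested share into a whole-sample share (the computation is illustrative, not part of the study itself):

```python
# Reported figures from the study above
smartphone_owners = 0.90   # 90% of respondents own a smartphone
iphone_of_owners = 0.58    # 58% of smartphone owners chose iPhone

# A nested share times the ownership rate gives the whole-sample share
iphone_share_of_sample = smartphone_owners * iphone_of_owners
print(f"iPhone owners as a share of all respondents: {iphone_share_of_sample:.0%}")
# → 52%
```

The same multiplication applies to the tablet figures (e.g., iPad owners are 65% of the 56% who own tablets).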


What do general contractors mainly use their mobile devices for?


A large majority (71%) of general contractors reported using their smartphones for both business and personal reasons, with 23% using them for business only and five percent for personal use only. For tablets, a plurality (47%) of general contractors use them for work, with 34% using tablets for personal use only (and eight percent mainly for reading).

Here is the breakdown for smartphone usage:

For personal use:

1. Making calls: 72%
2. Checking email: 14%
3. Texting: 10%
4. Other: 3%

An interesting, if vexing, number for marketers: no respondent claimed to use a smartphone mainly for personal shopping, and only one percent claimed to surf the web for leisure.

For professional use:

1. Business calls/emails: 42%
2. Business texts: 18%
3. GPS or maps for directions: 18%
4. Business-related apps: 12%
5. Assistance for supply shopping: 9%

In this respect, the limited use of mobile devices for business shopping is not surprising. As our past research demonstrates, general contractors use mobile devices to purchase personal or business products far less than general consumers do.


Do general contractors market online?


Based on the study, only 26% of respondents reported employing online forms of advertising and marketing. Among them, these were the primary methods:

1. Google ads campaigns: 34%
2. Social Media networking: 24%
3. Social Media advertising: 18%
4. Using the services of a third-party marketing company: 18%

Many marketers have proclaimed the era of banner advertising dead. General contractors seem to agree: only seven percent employed this form of marketing.


Why are general contractors not as involved online as other demographics?


The reason general contractors use mobile devices widely yet remain less engaged online than the rest of the population may have to do with the amount of time they spend online daily, as the study revealed:

Less than an hour: 33%
1-2 hours: 46%
3-4 hours: 19%
4-6 hours: 15%
More than 6 hours: 6%
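When tabulating distributions like the one above, a quick sanity check is to confirm that the reported shares sum to roughly 100% (allowing for rounding). A minimal sketch, using the figures as reported:

```python
# Figures copied from the time-online breakdown above
time_online = {
    "Less than an hour": 33,
    "1-2 hours": 46,
    "3-4 hours": 19,
    "4-6 hours": 15,
    "More than 6 hours": 6,
}

# Shares of a single-choice question should total ~100%;
# a higher total can indicate overlapping ranges or multiple selections
total = sum(time_online.values())
print(f"Total of reported shares: {total}%")
# → Total of reported shares: 119%
```

Here the total exceeds 100%, consistent with overlapping ranges (3-4 vs. 4-6 hours) or respondents selecting more than one option.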

(One should consider that the average person spends more than six hours a day online.)

Of the time spent online, the largest share went primarily to work-related tasks (70%), followed by reading the news (13%). Only six percent of general contractors used mobile devices primarily for personal shopping, with the same percentage for leisure/entertainment. This is well below the national average for mobile shopping. According to statistics garnered from Forbes: “74 percent of people use their mobile phone to help them while shopping, with 79 percent making a purchase as a result.”

This indicates that general contractors are judicious about their time spent on mobile devices, and online in general. The apparent reason, as we have mentioned, is that general contractors are themselves mobile and thus do not invest as much time online as other, more sedentary sectors of society. Furthermore, general contractors are extremely brand loyal and habitual when it comes to shopping, as per our past studies found in General Contractors & Brand Loyalty. The research also reveals that a plurality of general contractors surveyed (29%) are influenced in their shopping by the opinions of peers, not by marketing and advertising.

From a market research or marketing standpoint, this obviously presents challenges. General contractors build the world around us, but if you build a digital marketing strategy for them, it doesn’t mean they will come.
