
12 Types of Bias in Customer Surveys and 12 Smart Ways to Minimize Them

The best way to improve customer retention and foster brand loyalty is to ask your customers about ways in which you can serve them better. At the same time, it’s important that you collect accurate customer insights and feedback, as misleading data could lead to wasting valuable time and money. 

Today we’ll dive into one of the major causes of misleading data – customer survey bias. There are two main types of customer survey bias you should be aware of so that you don’t fall into the trap of basing major business decisions on skewed survey results:

  • Selection (sampling) bias, where the results are skewed a certain way because you’ve only captured feedback from a certain segment of your users.
  • Response bias, where there’s something about how the actual survey questionnaire is constructed that encourages a certain type of answer, leading to measurement error.

Let’s start by looking at five major types of selection bias, their causes and how to minimize their impact on your survey results.

Selection (Sampling) Bias

Selection or sampling bias happens when some users are systematically more likely to be selected than others. The problem with sampling bias is that findings from these feedback surveys represent the opinion of a limited group of users.

Imagine that you’ve surveyed 2,000 diligent students who visit the learning management system you’re responsible for on a daily basis. The sample will be heavily biased because it won’t be diverse enough to paint the whole picture: it leaves out other groups of users (e.g. students who rarely log in to the system) whose feedback is necessary to draw an accurate conclusion.

Causes of selection bias

Selection bias can happen in both probability and non-probability sampling. In probability sampling, every user of your product has a known chance of being selected. For instance, you can use a random number generator to select a simple random sample from your list of users.

Although this procedure reduces the risk of sampling bias, it may not eliminate it. If your sampling frame – the actual list of individuals that the sample is drawn from – does not match the user base, this can result in a biased sample. For example, if you are looking to draw conclusions about all of your customers, you will need to share your survey in such a way that all people have an equal chance of responding to the survey. If you only share a link to your survey via social media, it’s very likely you are biasing your results. Chances are, not all of your customers have social media accounts. Furthermore, only a fraction of those that do will have liked or followed your company page.
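
To make this concrete, here’s a minimal Python sketch of drawing a simple random sample and checking how much of the user base a single distribution channel actually covers. The user records and the follows_social field are illustrative, not a real API:

```python
import random

# Illustrative user records; in practice, export these from your user database.
users = [
    {"id": 1, "email": "a@example.com", "follows_social": True},
    {"id": 2, "email": "b@example.com", "follows_social": False},
    {"id": 3, "email": "c@example.com", "follows_social": False},
    {"id": 4, "email": "d@example.com", "follows_social": True},
]

# Probability sampling: every user in the frame has a known, equal chance
# of being selected when we draw a simple random sample.
sample = random.sample(users, k=2)
print("Invited to survey:", [u["id"] for u in sample])

# Frame check: if you only share the survey on social media, your effective
# sampling frame shrinks to the users who follow you there.
coverage = sum(u["follows_social"] for u in users) / len(users)
print(f"Social-only distribution reaches {coverage:.0%} of the user base")
```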

A non-probability sample is selected based on non-random criteria, and not every user has a chance of being included. For instance, you could be looking for very specific feedback on a new reporting feature your development team rolled out recently, and need input only from users who’ve already tried it. Or you’re trying to improve your onboarding experience, so you only want feedback from users who have completed the entire onboarding flow. In that case, it’s important to describe in your survey report why these users are the best sample for your survey and what value your team got from the results.
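
If you do take the non-probability route, it helps to make the selection criteria explicit, both in code and in the report. A small hypothetical sketch, with illustrative field names:

```python
# Purposive (non-probability) sampling: hand-pick only the users who qualify
# for this specific piece of feedback. Field names are illustrative.
users = [
    {"id": 1, "tried_reporting": True, "completed_onboarding": True},
    {"id": 2, "tried_reporting": False, "completed_onboarding": True},
    {"id": 3, "tried_reporting": True, "completed_onboarding": False},
]

# Explicit, documented criteria make it clear whose opinions the results
# represent when you write up the survey report.
reporting_pool = [u for u in users if u["tried_reporting"]]
onboarding_pool = [u for u in users if u["completed_onboarding"]]

print(f"{len(reporting_pool)} users qualify for reporting-feature feedback")
print(f"{len(onboarding_pool)} users qualify for onboarding feedback")
```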

5 types of selection bias

While surveying your users, you may encounter one of these types of selection bias.

  • Self-selection Bias: Users with specific characteristics are more likely to respond to feedback surveys. Example: Your brand promoters are more likely to respond to your surveys because they’re excited about your product.
  • Non-response Bias: Users who don’t respond to surveys or emails systematically differ from those who take part. Example: Users who visit your product once a month are less likely to respond to your survey because they don’t feel an affinity with your product.
  • Undercoverage Bias: Some users are inadequately represented in your feedback survey sample. Example: When launching your feedback campaign you may miss customers who never log in to your product because, for example, they only look at the reports it generates during monthly team meetings.
  • Survivorship Bias: Successful observations are more likely to be represented in a sample than unsuccessful ones. Example: In a team meeting you’re more likely to talk about a survey that was full of helpful insights than about one that didn’t bring significant results.
  • Pre-screening or Advertising Bias: The way users are pre-screened, or where a study is advertised, may bias a sample. Example: If you’ve reached out to your users only via email before launching your feedback campaign, the survey will most likely be taken by those who’ve read your email.

5 ways you can minimize selection bias

Completely avoiding sampling bias is close to impossible but you can do your best to minimize it. Here are a few steps you can take when working on your next customer survey campaign:

  1. Clearly define your survey goals.
  2. Share your survey on as many channels as possible: inside your product, your website, email, social, messengers, QR codes in your store or office.
  3. Keep your survey short and engaging.
  4. Make sure your survey is mobile-friendly.
  5. Send a follow-up message on all channels.

Response Bias

Response bias is when the way a question is asked, or the context it’s asked in, skews the answer you get from a user. For example, if you were running an e-store, you might ask a customer: “Did you like your T-shirt?”

There is a good chance that the person responding will say that they liked it even if it wasn’t the truth. This little white lie, to avoid confrontation, is response bias.

Response bias negatively impacts the quality of survey results. Continuing with the example above, if the e-store manager doesn’t know that their T-shirts’ quality is far from perfect, they will see a gradual decrease in purchase orders without understanding why.

As you can see, response bias has the power to impact business operations directly. That’s why Amazon is so meticulous when it comes to feedback collection.

Causes of response bias

You may encounter response bias in a range of situations. It could be that your respondents don’t understand the question, it might be that they responded to the questions in a noisy environment which made it hard for them to concentrate, maybe they want to portray themselves in a favorable light, or they simply don’t have the time to answer your questions thoughtfully.

Below are the seven most common response bias types along with examples for each.

7 types of response bias

  • Demand Characteristics Bias: Participants tend to predict a survey’s goals and respond in ways that correspond with those goals. Example: “How often do you log in to our product?” (Every day / Several times a week / Rarely / Never)
  • Social Desirability Bias: When it comes to sensitive topics (procrastination, gaming, unhealthy habits), people tend to answer questions in a way that will make them look good. Example: “How often do you have your car cleaned?” (Every day / Several times a week / Once a week / Once a month / Once a quarter / Twice a year / Once a year)
  • Acquiescence Bias: This happens when your users choose to respond positively to every statement or question in your survey. Example: “Are you following us on LinkedIn?” (Yes / No)
  • Dissent Bias: This happens when participants choose to respond negatively to every statement or question in your survey. Example: “Have you tried our new product feature?” (Yes / No)
  • Extreme Responses Bias: Respondents answer survey questions with answers on the extreme end of the options list, whether positive or negative. Example: “How would you rate your experience with our mobile app?” (Extremely satisfied / Somewhat satisfied / Neither satisfied nor dissatisfied / Somewhat dissatisfied / Very dissatisfied)
  • Neutral Responding Bias: This occurs when participants give neutral answers to most questions, usually because they experience survey fatigue or don’t have the time to provide thoughtful answers. Example: “What has been your experience with our customer success team?” (Extremely satisfied / Somewhat satisfied / Neither satisfied nor dissatisfied / Somewhat dissatisfied / Very dissatisfied)
  • Question Order Bias: This happens when respondents are “primed” by the context of a previous question, which affects their answers to subsequent questions. They answer later questions the same way they answered the first to remain consistent, or because the first question made them think about the issue in a different way. Example: the priming question “How happy are you with the [specific feature] of our product?” followed by “How happy are you with our product?”
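
Several of these patterns (acquiescence, dissent, extreme, and neutral responding) leave a visible fingerprint in collected data: respondents who “straight-line” the same answer, or only extreme answers, across every scale question. Here’s a minimal Python sketch for flagging such responses, assuming answers are coded on a 1-5 scale; the respondents and data are made up:

```python
# Flag respondents who straight-line: the same answer everywhere, or only
# extreme answers. Assumes responses are coded 1-5
# (1 = very dissatisfied, 5 = extremely satisfied).
def flag_straight_liners(responses: dict[str, list[int]]) -> list[str]:
    flagged = []
    for respondent, answers in responses.items():
        if len(set(answers)) == 1:               # identical answer everywhere
            flagged.append(respondent)
        elif all(a in (1, 5) for a in answers):  # extreme responding
            flagged.append(respondent)
    return flagged

survey = {
    "user_17": [5, 5, 5, 5, 5],  # acquiescence or extreme responding
    "user_42": [3, 3, 3, 3, 3],  # neutral responding
    "user_63": [4, 2, 5, 3, 4],  # plausibly genuine variation
}
print(flag_straight_liners(survey))  # ['user_17', 'user_42']
```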

7 ways you can minimize survey response bias

  1. To avoid demand characteristics bias, disclose as little as possible about the purpose of your study to your users. 
  2. Allow anonymous responses to reduce social desirability bias and use neutrally-worded questions. Whenever possible, check responses against what you know about your users from previous answers or your existing customer data.
  3. Acquiescence bias can be minimized by varying the types of questions you ask. Add multiple-choice and open-ended questions in addition to scale questions. Also, avoid “yes” and “no” questions, as they invite agreement and yield little insight or perspective.
  4. Use the tactics we described above to tackle dissent bias. At the same time, keep your surveys short to avoid survey fatigue.
  5. The best way to mitigate extreme responses bias is to carefully structure each survey question and to make the responses anonymous to give respondents the freedom to express themselves honestly.
  6. Keeping your survey short and weaving in open-ended questions will help you avoid neutral responding bias in your survey results.
  7. You can limit question order bias by testing the survey with your team first, with an eye for priming. Start with general questions and then move to specifics, and randomize question order wherever the sequence doesn’t matter (see the sketch after this list).
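
As a sketch of that last point, one compromise between a fixed general-to-specific flow and randomization is to shuffle questions only within “general” and “specific” blocks, so each respondent sees a different order while the overall flow is preserved. The question texts below are illustrative:

```python
import random

# Randomize question order per respondent to spread out priming effects,
# while keeping the general -> specific grouping intact by shuffling only
# within each block.
general = [
    "How happy are you with our product overall?",
    "How likely are you to recommend us to a colleague?",
]
specific = [
    "How happy are you with the reporting feature?",
    "How happy are you with the mobile app?",
]

def build_questionnaire() -> list[str]:
    g, s = general[:], specific[:]
    random.shuffle(g)
    random.shuffle(s)
    return g + s  # general questions always precede specific ones

print(build_questionnaire())
```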

Data Analysis Experts Recommend

If you really want to know your customers, use every chance to talk to them

To dive deeper into the subject, we’ve interviewed Masha Kubyshina Salvado, Technology and Data Director at Accelerate Change Network.

To minimize survey bias, ask yourself: 

  1. What’s the goal of my customer survey?
  2. What am I expecting to learn from my users?
  3. What does success look like for this survey?

The answers to these questions will help you compare your findings to your expectations. If they match, you know your customers; if not, it’s time to go back to the drawing board and adjust.

Masha also recommends that you:

  • Limit your survey to 5 questions. 
  • Make sure that the questions are concise and clear. 
  • Be mindful of all possible answers and always add the “Other” choice field in your survey.
  • Use customizations to improve response rates.
  • Before publishing the survey, test it with your co-workers and ask for feedback.

Don’t get discouraged when you see that only 2-10% of your customers answered the survey. People might enjoy your product but simply not have the time to respond. At the same time, be aware that your results represent the opinions only of those who took the survey. That’s why it’s important to track in-product sessions and compare analytics data to survey responses to identify differences and similarities.
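
One lightweight way to run that comparison is to check whether your respondents’ in-product activity resembles the rest of your user base. A hypothetical Python sketch, assuming you can export session counts per user from your analytics tool:

```python
from statistics import mean

# Compare in-product activity of survey respondents against the whole user
# base. A large gap suggests non-response bias: your results may
# over-represent your most engaged users. All numbers are illustrative.
sessions_per_user = {"u1": 30, "u2": 2, "u3": 25, "u4": 1, "u5": 28}
respondents = {"u1", "u3", "u5"}

respondent_avg = mean(sessions_per_user[u] for u in respondents)
overall_avg = mean(sessions_per_user.values())

print(f"Respondents average {respondent_avg:.1f} sessions/month "
      f"vs {overall_avg:.1f} for all users")
if respondent_avg > 1.5 * overall_avg:
    print("Responses likely skew toward highly engaged users")
```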

If you really want to know your customers, use every chance to talk to them and pick up nuances that you cannot get from a survey. Use the time with your customers to identify the main problem they’re trying to solve by using your product.

Muhammad Zohaib, a Technology Analyst at United Bank Limited who has conducted hundreds of user-research surveys, suggests that you:

  • Think of the user experience when crafting your survey and analyze your users’ session data.
  • When crafting the survey use the language your customers use to talk about your product.
  • Give users the freedom to choose the questions they will answer, don’t make all questions mandatory.
  • Make responses anonymous whenever possible to avoid social desirability bias.

Summary

Collecting and analyzing user responses is an excellent way to start improving customer retention, increasing brand loyalty and making informed business decisions. Even though it’s close to impossible to remove all forms of bias from your surveys, if you use the best practices we’ve outlined to design your surveys, target them with the user experience in mind and test them beforehand with an eye for priming, you can minimize the impact of bias on the results.

You can always rely on microsurvey tools like Appzi to create fully customizable questionnaires for in-product user experience research. Begin your user research journey by signing up for Appzi for free today.