Did you answer “no”? If so, then you have a bias. It’s called a blind spot. But don’t be too hard on yourself. Truth is, we’ve been hardwired for bias since we were monkeys. It was necessary for survival. See that tiger? Maybe you saw one just like it shred your best friend. So, the next time you see anything that resembles a furry animal with stripes, you are going to run like hell, right? Or let’s say you’re out in the woods and spot a yummy-looking red berry just like the one that poisoned your sister. Note to self: small red berries are bad, bad, bad. Don’t eat them.

Biases are baked into our brains. They are unconscious processes our brains use to “cut to the chase” — cognitive shortcuts. They happen in a split second, before we can even detect them. When faced with a furry animal with stripes, are you going to stand there and decide whether it’s dangerous? Not if you want to stay intact. Run first and think later. And that’s why it’s impossible not to be biased in some way. Market researchers need to be on the lookout for their own biases and try to remedy them. If not, they risk inaccurate research results.

Six Most Common Biases

Blind Spot
In a bias survey of 661 people, only one respondent thought they were more biased than the average person. The rest said they were less biased than the average person. Researchers tend to believe they know how to ask unbiased questions. Blind spot? Bingo.

Confirmation Bias
Confirmation bias has been described as an internal “yes man,” echoing back a person’s beliefs. Studies have repeatedly found that people often test hypotheses in a one-sided way, searching only for evidence consistent with their current belief. Confirmation bias happens when we look for the consequences we would expect if a belief were true rather than the consequences we would expect if it were false. Asking questions designed to receive an affirmative answer that supports your theory can lead you to gather inaccurate data.

Observer Expectancy Effect
We can inadvertently ask questions in a way that subtly communicates societal expectations. People are likely to respond in line with the social standards they think they ought to meet. For example, “Are your children’s clothes still dirty after you wash them?” Even if her child’s clothes are still dirty after washing, a mother may answer “no.” Mothers (versus fathers) are more focused on the cleanliness of their children’s clothes because they believe it reflects on them as mothers. Answering “yes” to that question risks the perception of “bad mother” rather than pointing to an ineffective laundry detergent or a faulty washing machine. A better way to ask the question might be, “How well does your laundry detergent/washing machine clean your child’s clothes?” Drill down from there to reach a deeper level of insight. Another way to avoid the observer expectancy effect is to pose the question as if it were about someone else. For instance, “What do you think others might think when they see a child whose clothes aren’t clean?”

Framing Effect
This is a cognitive bias that occurs when our response to a question is influenced by how the question is presented. Framing gives the same information a positive or negative connotation, typically by describing the outcome as a gain or a loss. For example, the following question is framed within the context of effective versus ineffective results: “When shopping for a disinfectant, do you look for a product that kills 95% of all germs over one where only 5% of germs will survive?” Even though the results are identical, people will choose the product that kills 95% of all germs. However, framing can also be used to convince shoppers that your product is more effective than a competitor’s; in that case, people will buy the disinfectant that kills 95% of all germs over the one where only 5% of germs survive.

Cultural Bias
We tend to interpret words or actions according to culturally derived meanings, based on a monolithic perception of culture. However, no culture is one-size-fits-all. There are subcultures, and within those there are micro-cultures such as terrarium enthusiasts, biohackers, and pickleball players. Micro-cultures are what comprise niche markets, and their members identify themselves through their shared interests. Consider, for instance, the way biohacker culture differentiates itself from mainstream health culture. So, if you identify with mainstream health culture and you’re doing biohacker market research, remember that the slightest hint that you think their beliefs are “weird” will create a wall between you and the biohacker culture. What’s more, your findings won’t provide the accurate and valid insights needed to serve a market of biohackers searching for its products.

Curse of Knowledge
Sometimes we unknowingly assume that others have the same knowledge and understanding that we do. The curse of knowledge means that the more familiar we are with something, the harder it is to put ourselves in the shoes of someone who is unfamiliar with it. Because of the way our brains work, it’s hard to unlearn what we know, and that makes it difficult to see with fresh eyes. Ever spent three frustrating hours with assembly instructions that assume you know things you don’t? That’s the curse of knowledge: it makes it harder to explain the basics to people who are new to the subject.

How to Keep Bias in Check
The problem with biases is that we often don’t know when we have them. Ask a friend or colleague whether they have observed any signs that suggest you could have a bias. Once a bias comes to your attention, you can do something about it. Don’t be disappointed in yourself for being biased. Simply accept that biases exist and that we all have them. Blame long-gone primitive ancestors who passed down a genetic code that was necessary to our continued survival. So, when you discover a bias, it’s not the end of the world. It’s just the end of the bias.