How many times have you had a conversation about a topic with a close friend and come away wondering how in the world they could believe what they do when they’re obviously wrong? It happens all the time and it creates enormous division – in politics, in our relationships and even in our workplace.
Here’s a perfect example – in our tribal politics, both parties have the same set of facts surrounding Trump’s actions on Ukraine aid, yet 82% of Democrats support impeaching Donald Trump while only 10% of Republicans do. How is it possible for reasonable people to look at the same set of facts and draw such starkly different conclusions?
Confirmation bias is the tendency to seek out and interpret information in ways that confirm one’s prior beliefs, and whether we like it or not, we all share it. In his podcast Hidden Brain (link to the episode), Shankar Vedantam argues that confirmation bias arises out of our reliance on people we trust to shape our beliefs. Emotion, he says, plays a bigger role in shaping our perspective than facts do.
This is how fake news spreads – when our close friend posts something on social media, we’re more likely to believe it if we trust the person, no matter how questionable the material might be. Essentially, facts are not enough, because they have to break through the trust barrier. Look no further than the debate on climate change or vaccination links to autism as examples of people taking contrary positions despite overwhelming factual evidence.
Confirmation Bias Is Not a New Phenomenon
How prevalent is confirmation bias? Shankar says this is not a recent problem. From the 1300s to the 1700s, Europeans believed in a fantastical lamb that grew from a plant, the so-called Vegetable Lamb of Tartary, simply because the story was passed along by trusted villagers, circulating like a game of telephone and drifting further from the original with each retelling.
During the 1800s, Dr. Ignaz Philipp Semmelweis, a Hungarian physician, was researching the high rate of maternal mortality during childbirth. He discovered that hand disinfection drastically reduced the death rate, so he published his findings and instituted a rule requiring doctors to wash their hands before performing a delivery. Although the mortality rate dropped from 10% to 3%, many doctors rejected Dr. Semmelweis’s findings because they took offense at the notion that, as gentlemen, they could have dirty hands (it turns out many doctors who delivered babies also worked on cadavers and were transferring bacteria to mothers during childbirth).
Translating this history to the present day, it’s no wonder we have such a problem of fake news and “alternative facts”, when we have entire social media ecosystems that are tailor-made for storytelling by trusted individuals on a mass scale.
How to Stop Confirmation Bias
If trust and emotion carry more weight than facts when forming our beliefs, then what can we do? According to Shankar, our ability to change our beliefs depends on:
- Our current belief, and our confidence level in that belief
- The new data, and our confidence in that new data
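The two factors above can be pictured as a confidence-weighted tug-of-war between the old belief and the new data. Here is a minimal sketch of that idea as a weighted average; the function, its parameters, and all of the numbers are hypothetical illustrations, not anything from the podcast:

```python
def update_belief(prior: float, prior_confidence: float,
                  evidence: float, evidence_confidence: float) -> float:
    """Toy model of belief revision as a confidence-weighted average.

    `prior` and `evidence` are degrees of belief in a claim (0.0 to 1.0);
    the two confidence values act as weights. A belief held with high
    confidence barely moves, no matter how strong the new data is.
    """
    total_weight = prior_confidence + evidence_confidence
    return (prior * prior_confidence
            + evidence * evidence_confidence) / total_weight

# A weakly held belief shifts substantially toward strong new evidence...
open_minded = update_belief(prior=0.2, prior_confidence=1.0,
                            evidence=0.9, evidence_confidence=4.0)

# ...while a strongly held belief barely moves given the same evidence.
entrenched = update_belief(prior=0.2, prior_confidence=10.0,
                           evidence=0.9, evidence_confidence=4.0)

print(round(open_minded, 2), round(entrenched, 2))  # prints "0.76 0.4"
```

The point of the toy model is simply that the same facts land very differently depending on how tightly the prior belief is held, which is why facts alone so often fail to persuade.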
To tip the scales, we have to elicit emotions to change our beliefs. This can be done in two very different ways:
1. Fear – This works in a couple of different ways.
- The fear causes enough stress to generate a desire for relief – for example, I’m a construction laborer, and my president tells me that illegal immigrants are going to take my job
- The fear is used to encourage inaction – for example, I have a newborn, and a trusted friend tells me that vaccination causes autism, sending me a few links to online articles
2. Hope – This is a much better motivator.
The podcast gives a great example of a hospital on the East Coast where only 1 in 10 medical staff sanitized their hands before entering a patient’s room. When the hospital posted an electronic board displaying the percentage of staff washing their hands, the rate rose to 90%. That positive reinforcement and gamification had a powerful motivational effect.
Confirmation Bias in the Workplace
In our profession, our beliefs inform our work and our interactions with colleagues, and if our belief system contradicts the facts on the ground, it will eventually lead to poor decision-making. Be honest with yourself – when you disagreed with a colleague, how often did you search Google until you found articles that supported your position? That’s confirmation bias at play.
Business (and economic) fiascos like the Fyre Festival don’t occur only because of Billy McFarland’s fraud; they occur because a whole team of employees finds ways to convince themselves that “this will work.” History is littered with product failures, from New Coke to R.J. Reynolds’ smokeless cigarettes, in which otherwise highly intelligent employees pored over focus groups, marketing analyses, and product tests, convincing themselves they were on the right track, only to have the market tell them otherwise.
When people in business ignore facts that contradict their beliefs, it creates a cognitive dissonance that will eventually lead to failure. This is particularly acute in a startup environment, where unproven companies must be funded based on a founder’s vision and nothing more. At the beginning of a startup’s journey, we have to “believe” and drive toward the company’s launch.
After that, however, we must embrace Eric Ries’ lean-startup concept of validated learning over our “gut,” because the only thing that matters is the market’s reaction to our product. Often our early MVPs are met with a poor reception, forcing us to pivot. If we cling to confirmation bias, we will ignore the signs and waste a lot of time, in the immortal words of Ash Maurya, “building something nobody wants.”