💠 Confirmation Bias

We seek out or interpret information that confirms our preconceptions, avoiding things that challenge them.

This is a draft chapter from my new book, Security Gems: Using Behavioural Economics to Improve Cybersecurity (working title).

Subscribe to read new chapters as I write them.

💠 Paying for confirmation

According to the flat Earth model of the universe, the sun and the moon are the same size.

You’ll find credible-looking mathematical models that argue for the theory. Photographs taken from a plane showing a flat horizon. Questions about how the seas could ever exist if the Earth were round.

You won’t find the calculations of Eratosthenes, who famously measured the circumference of a round Earth. Photographs of a round planet taken from space. Or mentions of gravity, which holds the water in the seas.

Or does it?

As humans, we have a disposition to confirm our beliefs by searching exclusively for information that supports a hunch while excluding opposing data.

Confirmation bias isn’t limited to conspiracy theorists. It shapes how we vote for politicians, leads investors to make poor decisions, steers businesses to focus on the wrong ideas, and almost certainly led you to buy this book.

During the 2008 US presidential election, Valdis Krebs analysed purchasing trends on Amazon. People who already supported Obama were the same people buying books which painted him in a positive light. People who already disliked Obama were the ones buying books painting him in a negative light. [1]

People weren’t buying books for the information. They were buying them for the confirmation.

I have no doubt that the people buying this book have a predisposition for product psychology.

Sound like you?

💠 Biased Search for Information

I love the word “yes”.

Yes, have an extra slice of cake. Yes, you do look good today. Yes, you are the best.

Experiment after experiment has shown that people tend to ask questions that are designed to yield a “yes”.

This is also known as the congruence heuristic [2].

Google search histories are a good demonstration of the affirmative questions we all love to ask.

“Are cats better than dogs?”

We prime Google that cats are indeed better than dogs. Google hears we have a preference for cats. Google plays ball, listing sites detailing reasons why cats are better than dogs.

“Are dogs better than cats?”

The same question phrased differently produces entirely different results. Now dogs are better.

“Which is better: cats or dogs?”

Or:

“What is the best pet for [my situation]?”

These would have been better questions. Obviously, the answer is always dogs.

Affirmative approaches to reasoning are common in security.

Analysts enter an investigation digging for the answer they really want. They worry about their manager pulling them up because they’ve not found anything juicy. The CISO needs their shiny dashboard showing the number of threats detected.

Teams lose sight of the bigger picture.

Such an approach creates blind spots, because people look for what they already know instead of considering the other possibilities: the negative test cases.
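One way to make room for those negative test cases is to pair every query that could confirm a hunch with one that could refute it. Here’s a minimal sketch of the idea; the event data and the `search_logs` matching logic are invented stand-ins for whatever SIEM or log-search tooling your team actually uses:

```python
# A toy, self-contained sketch. EVENTS and the substring matching in
# search_logs() are invented stand-ins for a real SIEM or log search.

EVENTS = [
    {"host": "ws-042", "msg": "outbound https beacon every 60s"},
    {"host": "ws-042", "msg": "scheduled backup job to backup vendor"},
]

def search_logs(term: str) -> list[dict]:
    """Toy log search: return events whose message contains the term."""
    return [e for e in EVENTS if term in e["msg"]]

def test_hypothesis(hypothesis: str, confirming: str, disconfirming: str) -> None:
    """Run both sides of a hypothesis, not just the side we hope is true."""
    print(f"Hypothesis: {hypothesis}")
    print(f"  supporting events:    {len(search_logs(confirming))}")
    print(f"  contradicting events: {len(search_logs(disconfirming))}")

# "The traffic spike is malware beaconing" vs. the boring alternative.
test_hypothesis(
    "Outbound traffic spike is malware beaconing",
    confirming="beacon",
    disconfirming="backup",
)
```

The tooling doesn’t matter; what matters is that the disconfirming query runs every time, rather than only when someone remembers to be sceptical.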

💠 Biased Interpretation

I hate the word “no”.

No, you can’t have an extra slice of cake. No, you don’t look good today. No, you are not the best.

It’s hard to accept something that conflicts with what we believe. So much so that our brains have developed a coping mechanism of sorts.

Imagine you’ve spent years researching a particular area of study.

Late nights in the lab trying to uncover evidence to support your hypothesis. Weekends spent fretting over calculations. Months lost scouring obscure libraries.

All to prove the world is flat.

So much knowledge makes it easy to explain away a “no”.

A picture of earth from space.

That’s Hollywood magic at work.

Tides.

Well, “Isaac Newton is said to have considered the tides to be the least satisfactory part of his theory of gravitation.” “Duh!” [3]

People tend not to change their beliefs on complex issues, even when presented with research, because of the way they interpret the evidence.

Capital punishment is another polarising issue, but one that also draws on our moral compass.

In one experiment, a mix of participants who either supported or opposed capital punishment were shown the same two studies on the subject.

After reading the detailed descriptions of the studies, participants still held their initial beliefs, supporting their reasoning with “confirming” evidence from the studies and rejecting any contradictory evidence, or deeming it inferior to the “confirming” evidence. [4]

We can all be guilty of trying to explain away things that don’t conform to what we believe.

“Well, that could never happen. Our firewall will block that type of thing”.

💠 Backfire effect

And we’re a stubborn bunch.

I’ve had some silly arguments in my time. Backing down in the heat of an argument with a partner can be hard in the moment, but laughable an hour later.

Politics is a similarly laughable pursuit.

Many people hold an allegiance to the same political party their whole life.

Democrats questioned why people still voted Republican when Trump was on the ticket, despite all the evidence questioning the reality of his claims to “Make America Great Again”.

Evidence might hold a strong position in a court of law. In the court of public opinion, it’s not so strong.

In fact, not only is it weaker, it can work against our reasoning! When people’s preexisting beliefs are challenged by contradictory evidence, the evidence is not only explained away; the beliefs have been shown to actually get stronger! [5]

All is not lost though.

Whilst one piece of disconfirming evidence does not change people’s views, it has been shown that a steady flow of credible refutations can correct misinformation and misconceptions.

Think about how you disseminate your research.

💠 Biased Memory

Before forensic science became an integral part of the criminal justice system, eyewitness accounts were the basis of a prosecutor’s case.

The problem is, our memory just isn’t particularly good. We remember some things and forget others. The brain links memories together for easier recall, often falling victim to confirmation bias, among other biases, in the process.

“Was the car speeding or not, ma’am?”

“Yes, officer. I heard the engine revving loudly.”

Confirmation bias influences eyewitnesses to make non-factual assumptions.

A revving engine might be linked to speeding in one person’s mind. A mechanic might recognise it as a badly tuned engine, completely unrelated to speed.

Hundreds of wrongful convictions in cases brought solely on eyewitness accounts have been overturned in recent years for this very reason.

The future is strongly influenced by memories of our past experiences. It’s fundamental to becoming the best.

Which is great if you’re trying to perfect a free kick into the top corner, but falls short in many other areas. Like reading the resumes of job applicants.

Oxford University: advance to interview. Likes cats: nope.

In one scenario, individuals were asked to read a woman’s profile detailing both her extroverted and introverted traits. Half were asked to assess her for a job as a librarian, the other half for a job as a salesperson.

Those assessing her as a salesperson better recalled her extroverted traits, while the other group recalled more examples of introversion [6]. Their memories told them the best salespeople were extroverted, and vice versa.

Hire on those biased memories and, before long, your team talks the same, thinks the same, and dresses the same. They thrive on validating the same outlook on the world.

To quote Eminem: “Will the real Slim Shady please stand up?”

Management consultants love to harp on about the benefits of seeing things from a different perspective. And they’re right.

Sometimes a breath of fresh air can give you a new take on security strategy.

💠 Security Gems

Try to prove yourself wrong.

  • Be careful with your research: Read entire articles, rather than forming conclusions based on the headlines and pictures. Search for credible evidence presented in articles from a diverse range of sources.
  • Prove assumptions wrong: Warren Buffett, one of the most successful investors of our time, is well aware of confirmation bias; one of his first actions before making an investment decision is to seek out opinions that contradict his own.
  • Plan for failure: When we accept that our first assumptions may not be correct and plan for failure, we allow teams to find the correct answer instead of settling for the simple, easy hypothesis.
  • Data helps, but be careful: Quantitative measures are much better to use in arguments due to their inherent factual nature. However, you should make it clear how data points should be interpreted (see the sketch after this list).
  • Surround yourself with a diverse group of people: Try to build a diverse team of individuals. Seek out people who challenge your opinions, perhaps someone in a different team, or assign someone on your team to play “devil’s advocate” for major decisions.
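As a toy illustration of the “data helps, but be careful” gem (the numbers and the dashboard metric here are invented), the same data point can support opposite conclusions depending on the denominator it’s paired with:

```python
# Invented numbers: "threats detected" doubled month on month, which
# looks like progress until it is normalised by the volume scanned.

months = {
    "January":  {"threats_detected": 50,  "events_scanned": 1_000_000},
    "February": {"threats_detected": 100, "events_scanned": 4_000_000},
}

for month, m in months.items():
    rate = m["threats_detected"] / m["events_scanned"]
    print(f"{month}: {m['threats_detected']} threats detected, "
          f"{rate:.4%} of events scanned")

# January: 50 threats detected, 0.0050% of events scanned
# February: 100 threats detected, 0.0025% of events scanned
# Twice the detections, half the detection rate. State the denominator,
# or the reader will pick whichever framing confirms their belief.
```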

[1] New Political Patterns
[2] Heuristics and Biases in Diagnostic Reasoning (Baron, 2000)
[3] Earth Not a Globe
[4] Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence (Lord, Ross, & Lepper, 1979)
[5] The Backfire Effect
[6] Testing hypotheses about other people: The use of historical knowledge (Snyder, M., & Cantor, N.,1979)
