💠 Confirmation Bias

We seek out or interpret information that confirms our preconceptions, avoiding things that challenge them.

This is a draft chapter from my new book, Security Gems: Using Behavioural Economics to Improve Cybersecurity (working title).

Subscribe to read new chapters as I write them.

💠 Paying for confirmation

According to the flat Earth model of the universe, the sun and the moon are the same size.

You’ll find credible-looking mathematical models that argue for the theory. Photographs taken from a plane showing a flat horizon. Queries about how the seas could ever exist if the Earth were round.

You won’t find the calculations of Eratosthenes, who is credited with first measuring the Earth’s circumference. Photographs taken from space of a round planet. Or mentions of gravity, which holds the water in the seas.

Or does it?
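As it happens, Eratosthenes’ famous measurement takes only a line of arithmetic to reproduce. A minimal sketch, using the figures commonly attributed to him:

```python
# Eratosthenes' method: at noon on the summer solstice the sun cast no
# shadow in Syene, while in Alexandria a vertical rod cast a shadow at
# about 7.2 degrees. That angle is the fraction of a full circle
# separating the two cities.
shadow_angle_deg = 7.2       # roughly 1/50th of a full circle
distance_stadia = 5000       # reported Alexandria-to-Syene distance
circumference_stadia = distance_stadia * (360 / shadow_angle_deg)
print(circumference_stadia)  # roughly 250,000 stadia
```

Roughly 250,000 stadia, remarkably close to the modern value, depending on which length of stadion you assume.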

As humans we have a disposition to confirm our beliefs by exclusively searching for information that supports a hunch while excluding opposing data.

Confirmation bias isn’t limited to conspiracy theorists. It causes us to vote for politicians, investors to make poor decisions, businesses to focus on the wrong ideas, and almost certainly led you to buy this book.

During the 2008 US presidential election, Valdis Krebs analysed purchasing trends on Amazon. People who already supported Obama were the same people buying books which painted him in a positive light. People who already disliked Obama were the ones buying books painting him in a negative light. [1]

People weren’t buying books for the information. They were buying them for the confirmation.

I’m in no doubt the people buying this book have a predisposition for product psychology.

Sound like you?

💠 Biased Search for Information

I love the word “yes”.

Yes, have an extra slice of cake. Yes, you do look good today. Yes, you are the best.

Experiment after experiment has shown that people tend to ask questions that are designed to yield a “yes”.

This is also known as the congruence heuristic [2].

Google search histories are a good demonstration of the affirmative questions we all love to ask.

“Are cats better than dogs?”

We prime Google with the idea that cats are indeed better than dogs. Google hears we have a preference for cats. Google plays ball, listing sites detailing reasons why cats are better than dogs.

“Are dogs better than cats?”

The same question phrased differently produces entirely different results. Now dogs are better.

“Which is better: cats or dogs?”

Or:

“What is the best pet for [my situation]?”

These would have been better questions. Obviously the answer is always dogs.

Affirmative approaches to reasoning are common in security.

Analysts enter an investigation digging for an answer they really want. They are worried about their manager pulling them up because they’ve not found anything juicy. The CISO needs their shiny dashboard showing the number of threats detected.

Teams lose sight of the bigger picture.

Such an approach creates blind spots because people are looking for what they know, instead of considering other possibilities: the negative test cases.
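To make that blind spot concrete, here is a hypothetical sketch: a detection rule tested only against samples it was written to catch looks perfect, while negative test cases (events it should ignore, and attacks it should catch but wasn’t written for) expose the gaps. The rule and sample events are invented purely for illustration.

```python
# Hypothetical detection rule and sample events, invented to illustrate
# positive vs. negative test cases.

def rule_detects(event):
    """A naive rule: flag any command line mentioning 'mimikatz'."""
    return "mimikatz" in event["cmdline"].lower()

# Confirmation-biased test set: only events we already expect to catch.
positives = [
    {"cmdline": "mimikatz.exe sekurlsa::logonpasswords"},
    {"cmdline": "powershell Invoke-Mimikatz -DumpCreds"},
]
assert all(rule_detects(e) for e in positives)  # the rule looks perfect

# Negative test cases reveal the blind spots.
should_not_fire = [
    {"cmdline": "notepad.exe mimikatz_incident_report.docx"},  # benign
]
should_still_fire = [
    {"cmdline": "renamed_tool.exe sekurlsa::logonpasswords"},  # renamed tool
]

false_positives = [e for e in should_not_fire if rule_detects(e)]
false_negatives = [e for e in should_still_fire if not rule_detects(e)]
print(len(false_positives), len(false_negatives))  # → 1 1
```

Testing only the confirming cases, the rule passes with flying colours; it takes the disconfirming cases to show it is both noisy and easy to evade.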

💠 Biased Interpretation

I hate the word “No”.

No, you can’t have an extra slice of cake. No, you don’t look good today. No, you are not the best.

It’s hard to accept something that conflicts with what we believe. So much so that our brains have developed a coping mechanism of sorts.

Imagine you’ve spent years researching a particular area of study.

Late nights in the lab trying to uncover evidence to support your hypothesis. Weekends spent fretting over calculations. Months lost scouring obscure libraries.

All to prove the world is flat.

So much knowledge makes it easy to explain away a “no”.

A picture of earth from space.

That’s Hollywood magic at work.

Tides.

Well, “Isaac Newton is said to have considered the tides to be the least satisfactory part of his theory of gravitation” [3]. “Duh!”

People tend to not change their beliefs on complex issues even after being provided with research because of the way they interpret the evidence.

Capital punishment is another polarising issue, but one that also draws on our moral compass.

In one experiment, a mix of participants who were either in support of, or against capital punishment were shown the same two studies on the subject.

After reading the detailed descriptions of the studies, participants still held their initial beliefs and supported their reasoning by providing “confirming” evidence from the studies and rejecting any contradictory evidence, or considering it inferior to the “confirming” evidence. [4]

We can all be guilty of trying to explain away things that don’t conform to what we believe.

“Well, that could never happen. Our firewall will block that type of thing”.

💠 Backfire effect

And we’re a stubborn bunch.

I’ve had some silly arguments in my time. Backing down in the heat of an argument with a partner can be hard at the time, but laughable an hour later.

Politics is a similarly laughable pursuit.

Many people hold an allegiance to the same political party their whole life.

Democrats questioned why people still voted Republican when Trump was on the ticket, despite all the evidence questioning the reality of his claims to “Make America Great Again”.

Evidence might hold a strong position in the court of law. In the court of public opinion, it’s not so strong.

In fact, not only is it not so strong, it can work against our reasoning! When people’s preexisting beliefs are challenged by contradictory evidence, they don’t just explain the evidence away; their beliefs have been shown to actually get stronger! [5]

All is not lost though.

Whilst one piece of disconfirming evidence does not result in a change in people’s views, it has been shown that a constant flow of credible refutations can correct misinformation and misconceptions.

Think about how you disseminate your research.

💠 Biased Memory

Before forensic science became an integral part of the criminal justice system, eyewitness accounts were the basis of a prosecutor’s case.

The problem is our memory just isn’t particularly good. We remember some things and forget others. The brain tries to link memories together for easier recall, often falling victim to confirmation bias, amongst other biases, in the process.

“Was the car speeding, ma’am?”

“Yes, officer. I heard the engine revving loudly.”

Confirmation bias influences eyewitnesses to make non-factual assumptions.

A revving engine might be linked to speeding in one mind. A mechanic might recognise this as a badly tuned engine, completely unrelated to speed.

Hundreds of wrongful convictions have been overturned in recent years as a result of cases brought solely on eyewitness accounts, for this very reason.

Our future actions are strongly influenced by memories of experiences in our past. It’s fundamental to becoming the best.

Which is great if you’re trying to perfect a free kick into the top corner, but often falls short in many other areas. Like reading the resumes of job applicants.

Oxford University: advance to interview. Likes cats: nope.

In one scenario, individuals were asked to read a woman’s profile detailing her extroverted and introverted traits. Half were asked to assess her suitability for a job as a librarian, the other half as a salesperson.

Those assessing her as a salesperson better recalled extroverted traits, while the other group recalled more examples of introversion [6]. Their memories told them the best salespeople were extroverted and vice-versa.

Before long your team talks the same, thinks the same, and dresses the same. They thrive on validating their shared outlook on the world.

To quote Eminem: “Will the real Slim Shady please stand up?”

Management consultants love to harp on about the benefits of seeing things from a different perspective. And they’re right.

Sometimes a breath of fresh air can give you a new take on security strategy.

💠 Security Gems

Try to prove yourself wrong.

  • Be careful with your research: Read entire articles, rather than forming conclusions based on the headlines and pictures. Search for credible evidence presented in articles from a diverse range of sources.
  • Prove assumptions wrong: Warren Buffett, one of the most successful investors of our time, is well aware of confirmation bias and one of his first actions before making an investment decision is to seek opinions that contradict his own.
  • Plan for failure: When we understand that our first assumptions will not be correct and plan for failure, we allow teams to find the correct answer instead of going with the simple and easy hypothesis.
  • Data helps, but be careful: Quantitative measures are much better to use in arguments due to their inherent factual nature. However, you should make it clear how data points should be interpreted.
  • Surround yourself with a diverse group of people: Try to build a diverse team of individuals. Seek out people that challenge your opinions, perhaps someone in a different team, or assign someone on your team to play “devil’s advocate” for major decisions.

[1] New Political Patterns
[2] Heuristics and Biases in Diagnostic Reasoning (Baron, 2000)
[3] Earth Not a Globe
[4] Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence (Lord, Ross, & Lepper, 1979)
[5] The Backfire Effect
[6] Testing hypotheses about other people: The use of historical knowledge (Snyder, M., & Cantor, N.,1979)

Security Gems: Using Behavioural Economics to Improve Cybersecurity

This post is a draft chapter from my new book. Pardon the typos.

Subscribe to read new chapters as I write them.

💠 The Isolation Effect

We remember things that stand out in the crowd. But different doesn’t necessarily mean it’s important.


💠 Standing out is not such a bad thing

To “stand out like a sore thumb” implies that something is noticed because it is very different from the things around it.

I’m often guilty of being the sore thumb. Dressed in shorts mid-winter, whilst those around me are wrapped in five layers of clothing.

One of the factors behind EasyJet’s success, arguably the pioneer of the low-cost flight, was to stick out like a sore thumb. The company’s early advertising consisted of little more than the airline’s telephone booking number painted in bright orange on the side of its aircraft.

“Have you heard of that orange airline?”, people would ask.

Have you ever highlighted information in a book? Then you too have used this effect to your advantage.

Psychologists have studied why our attention is usually captured by salient, novel, surprising, or distinctive stimuli [1]. Probably using a highlighter during their research.

Product designers understand our fascination with things that stand out and will spend hours perfecting the size, colour and shape of something to grab your attention, directing you on the path they want you to take.

Good products guide users to the important features and functions by making them stand out.

The big red flashing bell indicating a security alert should be distinctive, drawing attention and making it very clear that it needs to be looked at.

💠 Information overload can make standing out difficult

Being able to draw attention to something in the age of information overload is vital.

An email received from a friend or family member sticks out amid a sea of unfamiliar names.

A letter where the address is handwritten stands out, allowing me to easily filter boring correspondence from correspondence I will enjoy reading.

“YOU’VE WON A PRIZE”

“YOUR ACCOUNT HAS BEEN COMPROMISED”

These email subject lines have a similar effect.

Not only is someone shouting at you, they’re also warning you of a potentially serious event that arouses a sense of urgency.

It’s not your everyday (or hourly) “Sally has liked your photos taken in 2003 on Facebook” email. It’s serious.

In phishing school [2], you’ll find classes titled: How to grab a victim’s attention.

Successfully grabbing the attention of someone browsing their inbox is the first part of a successful campaign. You should expect the attackers to have aced that class.

💠 Not standing out can be disastrous

Digging deeper into the email inbox, or not as the case may be, it’s clear our brains weren’t designed to deal with mountains of spam.

So-called alert fatigue highlights this weakness. People stop noticing alerts, emails, texts, and [INSERT LATEST COOL MESSAGING SERVICE HERE] because there are simply too many.

People become desensitised to similar things being shown to them every day.

I once sat with a client who somewhat proudly proclaimed the “Alerts” folder in his inbox stood at 10,000 unread emails. That was nothing, he assured me; his colleague’s folder clocked in closer to six digits!

You don’t want to foster this culture.

When my fire alarm sounds, my heart rate accelerates as adrenaline is pumped into my bloodstream. The noise stands out. It’s important. It immediately draws all my attention. Yes, even from an oh-so-cute cat video.

Security alerting needs to have the same effect. To point you to real fires. To prioritise what is most important. Missing critical alerts, emails, texts, or warnings of actual fires does not typically end well.
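One way to sketch that prioritisation is to score alerts by severity and page only on the real fires. The severity scale and alert fields below are illustrative assumptions, not from any real product.

```python
# Illustrative severity scale; real products use their own schemas.
SEVERITY = {"info": 0, "low": 1, "medium": 2, "high": 3, "critical": 4}

def triage(alerts, page_threshold=3):
    """Rank alerts by severity, then split out the few worth an
    immediate page from the many that can wait for routine review."""
    ranked = sorted(alerts, key=lambda a: SEVERITY[a["severity"]], reverse=True)
    page_now = [a for a in ranked if SEVERITY[a["severity"]] >= page_threshold]
    review_later = [a for a in ranked if SEVERITY[a["severity"]] < page_threshold]
    return page_now, review_later

alerts = [
    {"id": 1, "severity": "info", "msg": "Sally liked your photo"},
    {"id": 2, "severity": "critical", "msg": "Possible ransomware beacon"},
    {"id": 3, "severity": "low", "msg": "Single failed logon"},
]
page_now, review_later = triage(alerts)
print([a["id"] for a in page_now])  # → [2]: only the real fire pages anyone
```

The point isn’t the sorting; it’s the threshold. If everything pages, nothing stands out, and you’re back to the 10,000-deep “Alerts” folder.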

💠 The art of deception

The ability to recognise and remember things that stand out has long proved advantageous to our species.

As hunter-gatherers, being able to determine something that stood out was vital in finding food and avoiding becoming food.

Evolution has long realised standing out is a disadvantage.

Chameleons.

The Arctic hare is another great example of the evolutionary importance of blending in.

In the winter, their bright white coats hide them from predators amongst a backdrop of snow. In spring, the hare’s colours change to blue-grey to approximate the local rocks and vegetation.

Humans are no different.

Go to a club on a Saturday night and watch the herds of men and women dressed head to toe in clubbing uniforms.

During my college years, flannel shirts were the “in-thing”. One night I bumped into three other guys, who all had great taste in fashion I will add, all wearing the same shirt.

Militaries around the world understand the importance of camouflage. Soldiers don’t want to stand out. It’s a matter of life and death on the battlefield.

Neither do criminals.

Threat actors know downloading terabytes of data in a short period of time will stand out. Instead, they slowly exfiltrate data over months so the patterns don’t stand out.

Malware is designed to act like a user, disguising itself as a normal process on an endpoint.

Yet so much of cyber security is focused on identifying the anomalies.

Sure, anomalies are important. It’s why so many vendors consistently demo that their product proudly detected “3 failed logons, from 3 different locations, in 3 seconds, for 1 account”.

However, the things that stick out, in a world where the bad guys are doing everything they can to stay hidden, are only part of the story.

💠 Breaking camouflage

In the early days of map making it took a lot of time to produce a map.

Companies had to hire someone to go out and walk every street.

Needless to say, plagiarism plagued the pre-computerised map making industry.

In the 1930s, General Drafting, a map-making company, came up with an ingenious idea. In their map of New York State they included a copyright trap: a fictitious place, Agloe [3].

Fast forward a few years and the company spotted Agloe detailed on a map produced by one of their fiercest competitors, Rand McNally.

Such was the problem that Agloe continued to appear on a number of maps up until the 1990s. I can imagine the disappointed faces of day-trippers, and the ensuing arguments about wrong turns.

These traps have come to be affectionately known as Mountweazels [4]: bogus entries deliberately inserted in a reference work. Prizes for anyone who spots the one in this book.

Like Mountweazels, honeypots are traps used in computer networks.

A honeypot mimics a system that may be attractive to an attacker, but would only ever be accessed by someone snooping around.

Just as a motion-activated light illuminates intruders attracted by the shiny objects in your house, a honeypot illuminates attackers attracted by the shiny potential it offers.
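A minimal sketch of the idea in Python, assuming nothing legitimate ever talks to the chosen port, so any connection at all is worth logging:

```python
import socket
import threading
from datetime import datetime, timezone

def run_honeypot(server, hits, max_hits=1):
    """Accept connections on an already-listening socket, log each one,
    and present a decoy banner. Nothing legitimate should ever connect,
    so every entry in `hits` is worth investigating."""
    for _ in range(max_hits):
        conn, addr = server.accept()
        hits.append({"src": addr[0], "time": datetime.now(timezone.utc)})
        conn.sendall(b"220 FTP server ready\r\n")  # shiny, fake service
        conn.close()

# Bind to an ephemeral localhost port for the demo.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(5)
port = server.getsockname()[1]

hits = []
trap = threading.Thread(target=run_honeypot, args=(server, hits))
trap.start()

# Simulate an attacker snooping around the open port.
probe = socket.create_connection(("127.0.0.1", port))
banner = probe.recv(64)
probe.close()
trap.join()
server.close()

print(len(hits), banner.decode().strip())
```

Because the port serves no real purpose, there are no false positives to explain away: every hit is interesting by definition.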

💠 Security Gems

If you want people to remember something, make it stand out.

  • Make the right path clear: If you want a user to take action in a certain way, guide them by making the route stand out.
  • Beware of normal: It’s easy to remember things that stand out, but distinctiveness is not the only attribute you should be worried about.
  • Don’t focus only on anomalies: Entice those operating covertly into the open. Break their camouflage.
  • Don’t make yourself obvious: Remember, attackers are drawn to things that stand out.
  • Communicate effectively: Make important communications and events distinctive in a way that makes sense. Remove the bullshit.
  • Think about methods of communication: Sending important alerts to a mobile phone might make them stand out over email alone.

[1] Salience, Attention, and Attribution: Top of the Head Phenomena (Taylor & Fiske, 1978)
[2] Completely fictitious.
[3] Agloe, New York (Wikipedia)
[4] Fictitious entry (Wikipedia)
