💎 Optimism Bias

When looking to the future, we tend to overestimate the good stuff and underestimate the bad.

This is a draft chapter from my new book, Security Gems: Using Behavioural Economics to Improve Cybersecurity (working title).

Subscribe to read new chapters as I write them.

💎 In Sickness and In Health

Marriage. It’s a wonderful thing, isn’t it?

In the Western world, the numbers don’t agree. Divorce rates are about 40 percent.

That means that out of every five married couples, two will end up divorced. But when you ask newlyweds about their own likelihood of divorce, they estimate it at zero percent.

Good luck to them!

Optimism bias is sometimes used interchangeably with ‘overconfidence’, and refers to the phenomenon whereby individuals believe they are less likely than others to experience a negative event.

As humans we need some level of optimism. If we went into marriage thinking it would end in divorce, marriage simply would not exist.

The optimism bias is an intriguing concept that comes with a host of benefits, such as shielding us from depression and ensuring we respond positively to failure.

Sadly, though, the optimism bias in cyber security leaves us overly vulnerable to cyber attack.

💎 It’ll never happen to me

When I was growing up, there was a kid in my neighbourhood who loved climbing trees. I was always suspicious one of his parents was a monkey.

He’d shoot up them without a second thought.

Once, thirty metres in the air, a branch broke beneath him. All of us standing below heard the crack. It sounded like a clap of thunder, followed by a heavy thud as the branch hit the ground.

Luckily he managed to quickly reach out and grab a branch above, saving himself from a long fall.

Whilst the slip didn’t bring him back down to earth, it did bring him back to reality. It took him the rest of the day to climb back down, and it was weeks before we saw him up another tree.

Being overly optimistic or self-confident can blind us to the very real likelihood of negative outcomes.

When there’s nothing to warn us of our impending doom, we get even more reckless.

Drink and drug driving is a massive problem, and is in large part a result of our unbounded optimism.

“I’ve only had a couple of beers” offers no solace to the family whose loved one has been killed as a result of impaired reaction times.

Nightclubs in Germany came up with a brilliant idea to reduce the problem of their patrons jumping into cars after a night on the tiles: piss screens.

Urinals allowed drivers to steer a car in a video game using their pee. Aim left to go left. Right to go right.

If you were too slow or swerved too much, that is, peed on the foot of the bloke next to you, the car would crash. “Too pissed to drive”, the screen would read, along with the number of the local taxi firm.

Again, in life we need moments to peg us back to reality.

When people receive emails they don’t necessarily treat them with the suspicion they deserve.

Far too often, we’re optimistic about the outcome of clicking links, and end up clicking malicious links or opening malicious attachments.

Whether it’s drink driving or clicking a link in an email, both can have catastrophic consequences.

Facebook do a great job of warning us about the result of our actions. Click an external link on your newsfeed and they’ll make you confirm the link shown is where you want to end up.

The aim here is to make the negative effects and losses of a certain action clear to the individual, and offer a clear, safer alternative.

Sadly Facebook don’t do this with uploading drunk photos yet.
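For links, at least, the pattern is simple to sketch. Here’s a minimal, hypothetical version in Python (the function name is mine, not Facebook’s): flag any link whose visible text claims a different domain to its real destination, and make the user confirm before leaving.

```python
from urllib.parse import urlparse

def needs_confirmation(display_text: str, target_url: str) -> bool:
    """Return True when the visible link text claims a different domain
    to the real destination -- a classic phishing trick."""
    target_domain = urlparse(target_url).netloc.lower()
    shown = display_text.strip().lower()
    if shown.startswith(("http://", "https://")):
        return urlparse(shown).netloc != target_domain
    return True  # plain-text labels: always worth confirming before leaving

# "The link shown" vs. where you actually end up:
print(needs_confirmation("https://mybank.com", "https://mybank.com/login"))  # False
print(needs_confirmation("https://mybank.com", "http://evil.example.net"))   # True
```

The design choice matters as much as the check itself: the warning makes the loss salient at the exact moment of the decision, and “stay here” is offered as the easy, safe default.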

💎 It’ll happen to them

Now, I’m not advocating we all become pessimists. World economies rely on optimism.

Entrepreneurs need optimism.

Do you ever find yourself in situations wondering “how hard could it be?”.

As an amateur home-chef, I have a particularly bad habit of asking this type of question when dining out. How hard could it be to create a menu? Cook the food? Leave the customers wanting more?

I make a great Pad Thai.

In my town one particular restaurant unit has changed hands five times in as many years. Italian. Indian. Thai. Greek. Italian, again.

It’s not unusual. In some cities, the chance of restaurant failure in the first year can be as high as 90%. That is, nine out of every ten restaurants opened will fail!

Nine in ten! Who would want to open a new restaurant?

Restaurateurs know the numbers, but despite the well-documented failure rates, they often don’t think they apply to them. They might argue their concept is different to the others, their restaurant is in a better part of town, or the cuisine is seeing new popularity.

But do they really have a better chance of success than others trying the same thing?

In the majority of cases, no.

The problem is we don’t see the reasons behind the statistics. We know little about others, but a lot about ourselves.

We’re optimistic about ourselves, we’re optimistic about our kids, we’re optimistic about our families, but we’re not so optimistic about the guy sitting next to us, and we’re pessimistic about the fate of our fellow citizens and the fate of our country.

This plagues those responsible for creating public health messaging.

One in two people in the UK will be diagnosed with cancer in their lifetime. But despite the odds, most people don’t think they’ll get cancer [1].

In the UK, 38 percent of cancer cases are preventable, and around 15 percent of all cases can be attributed to smoking.

Yet millions of people still smoke, pouring their hard-earned money into worsening their own health.

People explain it away. They go to the gym every day; other smokers don’t. They don’t drink like other smokers do.

Comparative optimism convinces us that others are more likely to suffer negative experiences than we are ourselves, even when we have no basis for a direct comparison.

Studies of perceived privacy risks show the same pattern: people believe unauthorised access to their accounts and the sharing of their personal information are much more likely to happen to other people [2].

Almost half of all UK businesses suffered some form of cyber security breach in 2020 [3].

Yet companies don’t think it will happen to them.

It’s why we can ignore network security risks while at the same time reading about other companies that have been breached. It’s why we think we can get by where others failed.

Optimism-induced invincibility needs to be accounted for, and removed. You are no better than your peers, mostly.

💎 Prevention is better than cure

Skiing. Windsurfing. Rock climbing. These are the kinds of things I love to do on holiday.

Health insurance companies don’t like me doing them. I know this because they charge me a hefty premium for coverage.

Previously I was guilty of questioning whether travel insurance was worth the money.

Speaking to the Swiss Mountain Rescue team one winter, I learned just how much it costs to be evacuated via helicopter. About $100 per minute. And that’s from takeoff to landing.

Perceptions of actual risk can be clouded by optimism. I don’t go on holiday to break a leg, but the chance is far from negligible.

It’s not just that we don’t think bad things can happen to us, or that they’re more likely to happen to someone else. We believe, all things being equal, that good outcomes are more probable than bad outcomes.

In one study, participants were given a list of 18 positive and 24 negative events, like getting a good job after graduation, developing a drinking problem, and so on [4].

Overall, they considered themselves 15% more likely than others to experience positive events, and 20% less likely than others to experience negative events.

People are more likely to accept risks if they feel they have some control over them.

Here we see the feeling of security diverging from the reality of security.

Controlling for this feeling is important.

We all know someone that has “seen it all”.

Experience often drives decision-making. It offers a sense of security.

But never let it cloud the actual risks, which should be assessed with one eye on experience and the other on fatalism.

💎 Security Gems

You are not invincible.

  • Set a “base rate”: Take the outside view. Look at base rates for your estimates as if you were assessing someone else’s chances (see the sketch after this list).
  • Conduct a premortem: Before making a decision, predict how a project or strategy could fail, then work backward to prevent those issues.
  • Make impending negative events caused by over-optimism clear: Bringing negative events to mind just before we’re likely to engage in an undesirable act can be a good behaviour change technique.
  • Use positive information to motivate: Instead of telling people why they shouldn’t do something, convince them with the benefits of an alternative. Remember, our optimism bias leads us to think we’re less likely to suffer negative outcomes compared to others.
  • Beware of feeling secure: Take a risk-based approach to security. Best practices are good to follow, but make sure they address the critical issues.
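To illustrate the first gem, here’s a minimal sketch of base-rate anchoring in Python. The 80/20 weighting is an assumption for illustration, not an established model; the point is simply that the outside view should dominate your inside view.

```python
def estimate_risk(base_rate: float, inside_view: float, weight: float = 0.8) -> float:
    """Anchor an estimate to the outside-view base rate.

    base_rate:   what happens to organisations like yours
                 (e.g. 0.46, "almost half of UK businesses" [3]).
    inside_view: what your optimism says about *your* organisation.
    weight:      how much the base rate dominates (an assumed value).
    """
    return weight * base_rate + (1 - weight) * inside_view

# "It won't happen to us" (5%) vs. the base rate (46%):
print(f"{estimate_risk(0.46, 0.05):.0%}")  # ~38%, a long way from 5%
```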

[1] Cancer risk statistics
[2] Optimistic bias about online privacy risks
[3] Almost half of UK businesses suffered a cyber attack in past year
[4] Unrealistic Optimism about Future Life Events


💎 Confirmation Bias

We seek out or interpret information that confirms our preconceptions, avoiding things that challenge them.


💎 Paying for confirmation

According to the flat Earth model of the universe, the sun and the moon are the same size.

You’ll find credible-looking mathematical models that argue the theory. Photographs taken from a plane showing a flat horizon. Queries about how the seas could ever exist if the earth were round.

You won’t find the calculations of Eratosthenes, who is credited with discovering that the earth is round. Photographs taken from space of a round planet. Or mentions of gravity, which holds the water in the seas.

Or does it?

As humans we have a disposition to confirm our beliefs by exclusively searching for information that supports a hunch while excluding opposing data.

Confirmation bias isn’t limited to conspiracy theorists. It causes us to vote for politicians, investors to make poor decisions, businesses to focus on the wrong ideas, and almost certainly led you to buy this book.

During the 2008 US presidential election, Valdis Krebs analysed purchasing trends on Amazon. People who already supported Obama were the same people buying books which painted him in a positive light. People who already disliked Obama were the ones buying books painting him in a negative light [1].

People weren’t buying books for the information. They were buying them for the confirmation.

I’m in no doubt the people buying this book have a predisposition for product psychology.

Sound like you?

💎 Biased Search for Information

I love the word “yes”.

Yes, have an extra slice of cake. Yes, you do look good today. Yes, you are the best.

Experiment after experiment has shown that people tend to ask questions that are designed to yield a “yes”.

This is also known as the congruence heuristic [2].

Google search histories are a good demonstration of the affirmative questions we all love to ask.

“Are cats better than dogs?”

We prime Google that cats are indeed better than dogs. Google hears we have a preference for cats. Google plays ball, listing sites detailing reasons why cats are better than dogs.

“Are dogs better than cats?”

The same question phrased differently produces entirely different results. Now dogs are better.

“Which is better: cats or dogs?”

Or:

“What is the best pet for [my situation]?”.

These would have been better questions. Obviously the answer is always dogs.

Affirmative approaches to reasoning are common in security.

Analysts enter an investigation digging for an answer they really want. They’re worried about their manager pulling them up because they’ve not found anything juicy. The CISO needs their shiny dashboard showing the number of threats detected.

Teams lose sight of the bigger picture.

Such an approach creates blind spots, because people are looking for what they know instead of considering other possibilities: the negative test cases.
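The same discipline applies to something as small as a detection rule. A hypothetical sketch: the rule below “confirms” the analyst’s hunch on the samples they chose, and only running benign, negative cases reveals how badly it misfires.

```python
import re

# A naive rule the analyst "knows" catches credential-phishing domains.
rule = re.compile(r"(login|secure|account)", re.IGNORECASE)

confirming_cases = ["secure-login-paypa1.com", "account-verify.example"]
negative_cases   = ["login.microsoftonline.com", "accounts.google.com"]

for domain in confirming_cases + negative_cases:
    print(f"{domain:30} -> {'ALERT' if rule.search(domain) else 'ok'}")

# Every negative case fires too. Testing only the confirming samples
# would have "proved" the rule works; the benign traffic says otherwise.
```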

💎 Biased Interpretation

I hate the word “No”.

No, you can’t have an extra slice of cake. No, you don’t look good today. No, you are not the best.

It’s hard to accept something that conflicts with what we believe. So much so that our brains have developed a coping mechanism of sorts.

Imagine you’ve spent years researching a particular area of study.

Late nights in the lab trying to uncover evidence to support your hypothesis. Weekends spent fretting over calculations. Months lost scouring obscure libraries.

All to prove the world is flat.

So much knowledge makes it easy to explain away a “no”.

A picture of earth from space.

That’s Hollywood magic at work.

Tides.

Well, “Isaac Newton is said to have considered the tides to be the least satisfactory part of his theory of gravitation” [3]. “Duh!”

People tend to not change their beliefs on complex issues even after being provided with research because of the way they interpret the evidence.

Capital punishment is another polarising issue, but one that also draws on our moral compass.

In one experiment, a mix of participants who either supported or opposed capital punishment were shown the same two studies on the subject.

After reading the detailed descriptions of the studies, participants still held their initial beliefs and supported their reasoning by providing “confirming” evidence from the studies and rejecting any contradictory evidence, or considering it inferior to the “confirming” evidence [4].

We can all be guilty of explaining away things that don’t conform to what we believe.

“Well, that could never happen. Our firewall will block that type of thing”.

💎 The Backfire Effect

And we’re a stubborn bunch.

I’ve had some silly arguments in my time. Backing down in the heat of an argument with a partner can be hard at the time, but laughable an hour later.

Politics is a similarly laughable pursuit.

Many people hold an allegiance to the same political party their whole life.

Democrats questioned why people still voted Republican when Trump was on the ticket, despite all the evidence questioning the reality of his claims to “Make America Great Again”.

Evidence might hold a strong position in a court of law. In the court of public opinion, it’s not so strong.

In fact, not only is it not so strong, it can work against our reasoning! When people’s preexisting beliefs are challenged by contradictory evidence, they don’t just explain the evidence away; their beliefs have been shown to actually get stronger! [5]

All is not lost though.

Whilst one piece of disconfirming evidence does not result in a change in people’s views, it has been shown that a constant flow of credible refutations can correct misinformation and misconceptions.

Think about how you disseminate your research.

💎 Biased Memory

Before forensic science became an integral part of the criminal justice system, eyewitness accounts were the basis of a prosecutor’s case.

The problem is our memory just isn’t particularly good. We remember some things and forget others. It tries to link memories together for easier recall, often falling victim to confirmation bias, amongst other biases, in the process.

“Was the car speeding or not speeding, ma’am?”

“Yes, officer. I heard the engine revving loudly.”

Confirmation bias influences eyewitnesses to make non-factual assumptions.

A revving engine might be linked to speeding in one mind. A mechanic might recognise this as a badly tuned engine, completely unrelated to speed.

Hundreds of wrongful convictions have been overturned in recent years as a result of cases brought solely on eyewitness accounts, for this very reason.

Our view of the future is strongly influenced by memories of experiences in our past. It’s fundamental to becoming the best.

Which is great if you’re trying to perfect a free kick into the top corner, but often falls short in many other areas. Like reading the resumes of job applicants.

Oxford University; advance to interview. Likes cats; nope.

In one scenario, individuals were asked to read a woman’s profile detailing her extroverted and introverted traits. Half were asked to assess her suitability for a job as a librarian, the other half for a job as a salesperson.

Those assessing her as a salesperson better recalled her extroverted traits, while the librarian group recalled more examples of introversion [6]. Their memories told them the best salespeople were extroverted, and vice versa.

Before long your team talks the same, thinks the same, and dresses the same. They thrive on validating the same outlook on the world.

To quote Eminem: “Will the real Slim Shady please stand up?”

Management consultants love to harp on about the benefits of seeing things from a different perspective. And they’re right.

Sometimes a breath of fresh air can give you a new take on security strategy.

💎 Security Gems

Try to prove yourself wrong.

  • Be careful with your research: Read entire articles, rather than forming conclusions based on the headlines and pictures. Search for credible evidence presented in articles from a diverse range of sources.
  • Prove assumptions wrong: Warren Buffett, one of the most successful investors of our time, is well aware of confirmation bias and one of his first actions before making an investment decision is to seek opinions that contradict his own.
  • Plan for failure: When we understand that our first assumptions will not be correct and plan for failure, we allow teams to find the correct answer instead of going with the simple and easy hypothesis.
  • Data helps, but be careful: Quantitative measures are much better to use in arguments due to their inherent factual nature. However, you should make it clear how data points should be interpreted.
  • Surround yourself with a diverse group of people: Try to build a diverse team of individuals. Seek out people that challenge your opinions, perhaps someone in a different team, or assign someone on your team to play “devil’s advocate” for major decisions.

[1] New Political Patterns
[2] Heuristics and Biases in Diagnostic Reasoning (Baron, 2000)
[3] Earth Not a Globe
[4] Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence (Lord, Ross, & Lepper, 1979)
[5] The Backfire Effect
[6] Testing hypotheses about other people: The use of historical knowledge (Snyder, M., & Cantor, N.,1979)


💎 The Isolation Effect

We remember things that stand out in the crowd. But different doesn’t necessarily mean it’s important.


💎 Standing out is not such a bad thing

To “stand out like a sore thumb” implies that something is noticed because it is very different from the things around it.

I’m often guilty of being the sore thumb. Dressed in shorts mid-winter, whilst those around me are being warmed by five layers of clothing.

One of the factors behind EasyJet’s success, arguably the pioneer of the low-cost flight, was to stick out like a sore thumb. The company’s early advertising consisted of little more than the airline’s telephone booking number painted in bright orange on the side of its aircraft.

“Have you heard of that orange airline?”, people would ask.

Have you ever highlighted information in a book? Then you too have used this effect to your advantage.

Psychologists have studied why our attention is usually captured by salient, novel, surprising, or distinctive stimuli [1]. Probably using a highlighter during their research.

Product designers understand our fascination with things that stand out and will spend hours perfecting the size, colour and shape of something to grab your attention, directing you on the path they want you to take.

Good products guide users to the important features and functions by making them stand out.

The big red flashing bell indicating a security alert should be distinctive, drawing attention and making it very clear that it needs to be looked at.

💎 Information overload can make standing out difficult

Being able to draw attention to something in the age of information overload is vital.

An email received from a friend or family member sticks out amid a sea of unfamiliar names.

A letter where the address is handwritten stands out, allowing me to easily filter boring correspondence from correspondence I will enjoy reading.

“YOU’VE WON A PRIZE”

“YOUR ACCOUNT HAS BEEN COMPROMISED”

These email subject lines have a similar effect.

Not only is someone shouting at you, they’re also warning you of a potentially serious event that arouses a sense of urgency.

It’s not your everyday (or hourly) “Sally has liked your photos taken in 2003 on Facebook” email. It’s serious.

In phishing school [2], you’ll find classes titled: How to grab a victim’s attention.

Successfully grabbing the attention of someone browsing their inbox is the first part of a successful campaign. You should expect the attackers to have aced that class.

💎 Not standing out can be disastrous

Digging deeper into the email inbox, or not as the case may be, it’s clear our brains weren’t designed to deal with mountains of spam.

So-called alert fatigue highlights this weakness. People stop noticing alerts, emails, texts, and [INSERT LATEST COOL MESSAGING SERVICE HERE] messages because there are simply too many.

People become desensitised to similar things being shown to them every day.

I once sat with a client who somewhat proudly proclaimed that the “Alerts” folder in his inbox stood at 10,000 unread emails. That was nothing, he assured me; his colleague’s folder clocked in closer to six digits!

You don’t want to foster this culture.

When my fire alarm sounds, my heart rate accelerates as adrenaline is pumped into my bloodstream. The noise stands out. It’s important. It immediately draws all my attention. Yes, even from an oh-so-cute cat video.

Security alerting needs to have the same effect. To point you to real fires. To prioritise what is most important. Missing critical alerts, emails, texts, or warnings of actual fires does not typically end well.
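One way to preserve that fire-alarm effect is to suppress the repeats and only page a human for what genuinely matters. A minimal sketch, with assumed alert fields and severity labels:

```python
from collections import Counter

def page_on_call(alert: dict) -> None:
    print("PAGE:", alert["rule"], "on", alert["host"])

def append_to_digest(alert: dict) -> None:
    pass  # e.g. append to a once-a-day summary email

def triage(alerts: list[dict]) -> None:
    """Suppress repeats; keep the critical alerts rare and loud."""
    seen: Counter = Counter()
    for alert in alerts:
        key = (alert["rule"], alert["host"])
        seen[key] += 1
        if seen[key] > 1:
            continue  # duplicate: report the count in the digest instead
        if alert["severity"] == "critical":
            page_on_call(alert)      # the fire alarm
        else:
            append_to_digest(alert)  # reviewed in batch, not one by one

triage([{"rule": "malware-beacon", "host": "db01", "severity": "critical"},
        {"rule": "failed-login", "host": "web01", "severity": "low"},
        {"rule": "failed-login", "host": "web01", "severity": "low"}])
```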

💎 The art of deception

The ability to recognise and remember things that stand out has long proved advantageous to our species.

As hunter-gatherers, being able to spot something that stood out was vital in finding food, and in avoiding becoming food.

Evolution has also long realised that standing out can be a disadvantage.

Chameleons.

The Arctic hare is another great example of the evolutionary importance of blending in.

In winter, their bright white coats hide them from predators against a backdrop of snow. In spring, the hare’s coat changes to blue-grey, approximating the local rocks and vegetation.

Humans are no different.

Go to a club on a Saturday night and watch the herds of men and women dressed head to toe in clubbing uniforms.

During my college years flannel shirts were the “in thing”. One night I bumped into three other guys, who all had great taste in fashion I will add, all wearing the same shirt.

Militaries around the world understand the importance of camouflage. Soldiers don’t want to stand out. It’s a matter of life and death on the battlefield.

Neither do criminals.

Threat actors know downloading terabytes of data in a short period of time will stand out. Instead, they slowly exfiltrate data over months so the patterns don’t stand out.

Malware is designed to act like a user, disguising itself as a normal process on an endpoint.

Yet so much of cyber security is focused on identifying the anomalies.

Sure, anomalies are important. It’s why so many vendors consistently demo that their product proudly detected “3 failed logons, from 3 different locations, in 3 seconds, for 1 account”.

However, the things that stick out, in a world where the bad guys are doing everything they can to stay anonymous, are only part of the story.
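A toy illustration with made-up numbers: a simple daily-volume threshold catches the smash-and-grab, never fires on the patient attacker, yet a cumulative view of the very same data does.

```python
DAILY_THRESHOLD_MB = 1_000  # alert on >1 GB/day outbound (assumed policy)

smash_and_grab = [0, 0, 12_000, 0]  # 12 GB out in a single day
low_and_slow = [150] * 90           # 150 MB/day, every day, for 90 days

def daily_anomaly(outbound_mb_per_day):
    return any(day > DAILY_THRESHOLD_MB for day in outbound_mb_per_day)

print(daily_anomaly(smash_and_grab))  # True  -- the burst stands out
print(daily_anomaly(low_and_slow))    # False -- 13.5 GB leaves unnoticed

# A cumulative, per-host view of the same data tells a different story:
print(sum(low_and_slow))  # 13500 MB, well past any sensible total
```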

💎 Breaking camouflage

In the early days of map making, producing a map took a lot of time.

Companies had to hire someone to go out and walk every street.

Needless to say, plagiarism plagued the pre-computerised map making industry.

In the 1930s, General Drafting, a map-making company, came up with an ingenious idea. In their map of New York State they included a copyright trap: a fictitious place, Agloe [3].

Fast forward a few years and the company spotted Agloe detailed on a map produced by one of their fiercest competitors, Rand McNally.

Such was the extent of the copying that Agloe continued to appear on a number of maps up until the 1990s. I can imagine the disappointed faces of day-trippers, and the ensuing arguments about wrong turns.

These traps have come to be affectionately known as Mountweazels [4]: a bogus entry deliberately inserted in a reference work. Prizes for anyone who spots the one in this book.

Like Mountweazels, honeypots are traps set in computer networks.

A honeypot mimics a system that may be attractive to an attacker, but would only ever be accessed by someone snooping around.

Like a motion-activated light illuminating intruders attracted by the shiny objects in your house, a honeypot illuminates attackers attracted by the shiny potential it offers.
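A minimal honeypot sketch, assuming a fake SSH banner on an otherwise unused port: no legitimate user has any reason to connect, so every connection that does arrive is a high-fidelity alert.

```python
import socket

def honeypot(port: int = 2222) -> None:
    """Fake SSH service. Nothing real lives here, so any connection
    is a strong signal that someone is snooping around."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as server:
        server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        server.bind(("0.0.0.0", port))
        server.listen()
        while True:
            conn, (addr, _) = server.accept()
            print(f"ALERT: connection from {addr} -- investigate")
            conn.sendall(b"SSH-2.0-OpenSSH_8.2\r\n")  # look plausible
            conn.close()

# honeypot()  # runs forever; point your monitoring at its output
```

Real deployments add logging, isolation, and lures worth snooping on, but the principle is exactly the Mountweazel’s: the only people who find it are the ones you want to catch.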

💎 Security Gems

If you want people to remember something, make it stand out.

  • Make the right path clear: If you want a user to take action in a certain way, guide them by making the route stand out.
  • Beware of normal: It’s easy to remember things that stand out, but distinctiveness is not the only attribute you should be worried about.
  • Don’t focus only on anomalies: Entice those operating covertly into the open. Break their camouflage.
  • Don’t make yourself obvious: Remember, attackers are drawn to things that stand out.
  • Communicate effectively: Make important communications and events distinctive in a way that makes sense. Remove the bullshit.
  • Think about methods of communication: Sending important alerts to a mobile phone might make them stand out more than email alone.

[1] Salience, Attention, and Attribution: Top of the Head Phenomena (Taylor & Fiske, 1978)
[2] Completely fictitious.
[3] Agloe, New York (Wikipedia)
[4] Fictitious entry (Wikipedia)
