πŸ’Ž On our minds working on problems even when we’re not consciously thinking about them (John Cleese)

Graham and I thought it was rather a good sketch. It was therefore terribly embarrassing when I found I’d lost it. I knew Graham was going to be cross, so when I’d given up looking for it, I sat down and rewrote the whole thing from memory. It actually turned out to be easier than I’d expected.

Then I found the original sketch and, out of curiosity, checked to see how well I’d recalled it when rewriting. Weirdly, I discovered that the remembered version was actually an improvement on the one that Graham and I had written. This puzzled the hell out of me.

Again I was forced to the conclusion that my mind must have continued to think about the sketch after Graham and I had finished it. And that my mind had been improving what we’d written, without my making any conscious attempt to do so. So when I remembered it, it was already better.

Chewing this over, I realised it was like the tip-of-the-tongue phenomenon: when you can’t remember a name, and you chase after it in your mind …

Excerpt from: Creativity: A Short and Cheerful Guide by John Cleese

πŸ’Ž Kleiner Perkins’s tactic for avoiding their staff developing entrenched positions in meetings (flip-flop)

Another renowned venture capitalist, Kleiner Perkins’s Randy Komisar, takes this idea one step further. He dissuades members of the investment committee from expressing firm opinions by stating right away that they are for or against an investment idea. Instead, Komisar asks participants for a β€œbalance sheet” of points for and against the investment: β€œTell me what is good about this opportunity; tell me what is bad about it. Do not tell me your judgment yet. I don’t want to know.” Conventional wisdom dictates that everyone should have an opinion and make it clear. Instead, Komisar asks his colleagues to flip-flop!

Excerpt from: You’re About to Make a Terrible Mistake!: How Biases Distort Decision-Making and What You Can Do to Fight Them by Olivier Sibony

πŸ’Ž Analysing successful brands can be misleading (survivorship bias)

The models whose success we admire are, by definition, those who have succeeded. But out of all the people who were “crazy enough to think they can change the world,” the vast majority did not manage to do it. For this very reason, we’ve never heard of them. We forget this when we focus only on the winners. We look only at the survivors, not at all those who took the same risks, adopted the same behaviors, and failed. This logical error is survivorship bias. We shouldn’t draw any conclusions from a sample that is composed only of survivors. Yet we do, because they are the only ones we see.

Our quest for models may inspire us, but it can also lead us astray. We would benefit from restraining our aspirations and learning from people who are similar to us, from decision makers whose success is less flashy, instead of a few idols.

Excerpt from: You’re About to Make a Terrible Mistake!: How Biases Distort Decision-Making and What You Can Do to Fight Them by Olivier Sibony

πŸ’Ž On why partial knowledge is often victorious over full knowledge (it conceives things as simpler than they are)

Such misleading stories, however, may still be influential and durable. In Human, All Too Human, philosopher Friedrich Nietzsche argues that β€œpartial knowledge is more often victorious than full knowledge: it conceives things as simpler than they are and therefore makes its opinion easier to grasp and more persuasive.”

Excerpt from: The Myth of Experience: Why We Learn the Wrong Lessons, and Ways to Correct Them by Emre Soyer and Robin M Hogarth

πŸ’Ž On the danger of a theory free analysis of mere correlations (winter detector)

The ‘winter detector’ problem is common in big data analysis. A literal example, via computer scientist Sameer Singh, is the pattern-recognising algorithm that was shown many photos of wolves in the wild, and many photos of pet husky dogs. The algorithm seemed to be really good at distinguishing the two rather similar canines; it turned out that it was simply labelling any picture with snow as containing a wolf. An example with more serious implications was described by Janelle Shane in her book You Look Like a Thing and I Love You: an algorithm that was shown pictures of healthy skin and of skin cancer. The algorithm figured out the pattern: if there was a ruler in the photograph, it was cancer. If we don’t know why the algorithm is doing what it’s doing, we’re trusting our lives to a ruler detector.

Excerpt from: How to Make the World Add Up: Ten Rules for Thinking Differently About Numbers by Tim Harford

πŸ’Ž On the lack of data proving the effectiveness of ad campaigns (designed to boost loyalty)

The advertising industry – whose only important asset is ideas – has learned nothing from this. We keep heading in the wrong direction. We keep bulking up everything in our arsenal except our creative resources. Then we take the people who are supposed to be our idea people and give them till 3 o’clock to do a banner.

Sure, we need people who are tech-savvy and analytical. But more than anything, we need some brains-in-a-bottle who have no responsibility other than to sit in a corner and feed us crazy ideas. We keep looking to β€œtransform” our industry but ignore the one transformation that would kill.

Excerpt from: How not to Plan: 66 ways to screw it up by Les Binet and Sarah Carter

πŸ’Ž Five pronged model for encouraging behaviour change (reduce)


Reactance. When pushed, people push back. So rather than telling people what to do, or trying to persuade, catalysts allow for agency and encourage people to convince themselves.

Endowment. People are attached to the status quo. To ease endowment, catalysts surface the costs of inaction and help people realize that doing nothing isn’t as costless as it seems.

Distance. Too far from their backyard, people tend to disregard. Perspectives that are too far away fall in the region of rejection and get discounted, so catalysts shrink distance, asking for less and switching the field.

Uncertainty. Seeds of doubt slow the winds of change. To get people to un-pause, catalysts alleviate uncertainty. Easier to try means more likely to buy.

Corroborating Evidence. Some things need more proof. Catalysts find corroborating evidence, using multiple sources to help overcome the translation problem.

Excerpt from: Catalyst by Jonah Berger

πŸ’Ž Six psychological biases that help explain why we fail to prepare for disasters

1. Myopia: a tendency to focus on overly short future time horizons when appraising immediate costs and the potential benefits of protective investments;
2. Amnesia: a tendency to forget too quickly the lessons of past disasters;
3. Optimism: a tendency to underestimate the likelihood that losses will occur from future hazards;
4. Inertia: a tendency to maintain the status quo or adopt a default option when there is uncertainty about the potential benefits of investing in alternative protective measures;
5. Simplification: a tendency to selectively attend to a subset of the relevant factors to consider when making choices involving risk; and
6. Herding: a tendency to base choices on the observed actions of others.

Excerpt from: The Ostrich Paradox: Why We Underprepare for Disasters by Robert Meyer and Howard Kunreuther

πŸ’Ž The ten steps for a successful behavioural science intervention

1. Establish the scope.
2. Break the challenge into addressable parts.
3. Identify the target outcome.
4. Map the relevant behaviors.
5. Identify the factors that affect each behavior.
6. Choose the priority behaviors to address.
7. Create evidence-led intervention(s).
8. Implement the intervention(s).
9. Assess the effects.
10. Take further action based on the results.

Excerpt from: Behavioural Insights by Michael Hallsworth

πŸ’Ž Three ideas from psychology that explain why brainstorms tend to be ineffective (from social loafing to production blocking)

Research shows there are many psychological processes at work which together limit the effectiveness of brainstorming. ‘Social loafing’ – a group situation encourages and allows individuals to slack off. ‘Evaluation apprehension’ – we’re nervous of being judged by colleagues or looking stupid. ‘Production blocking’ – because only one person can speak at a time in a group, others can forget or reject their ideas while they wait. We’re also learning more about the power of our ‘herd’ tendencies. As humans, we have innate desires to conform to others with only the slightest encouragement. When asked to think creatively, these implicit norms are invisible but powerful shackles on our ability to think differently.

No wonder so few ideas emerge.

Excerpt from: How not to Plan: 66 ways to screw it up by Les Binet and Sarah Carter

πŸ’Ž On the danger of statistical methods being used to control the world (rather than understand it)

Social scientists have long understood that statistical metrics are at their most pernicious when they are being used to control the world, rather than try to understand it. Economists tend to cite their colleague Charles Goodhart, who wrote in 1975: ‘Any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes.’ (Or, more pithily: ‘When a measure becomes a target, it ceases to be a good measure.’) Psychologists turn to Donald T. Campbell, who around the same time explained: β€œThe more any quantitative social indicator is used for social decision-making, the more subject it will be to corruption pressures and the more apt it will be to distort and corrupt the social processes it is intended to monitor.”

Goodhart and Campbell were on to the same basic problem: a statistical metric may be a pretty decent proxy for something that really matters, but it is almost always a proxy rather than the real thing.

Excerpt from: How to Make the World Add Up: Ten Rules for Thinking Differently About Numbers by Tim Harford

πŸ’Ž There is no such thing as a wholly original idea (but there is such a thing as unique combinations)

It is to be found in the exceptional human capacity to synthesize our experiences, influences, knowledge and feelings into one, unified, original entity. To have such an inbuilt facility that enables us to make seemingly random connections across a broad range has to be the single most important creative faculty we have, as Einstein observed when he said, ‘Combinatory play seems to be the essential feature in productive thought.’

The process our conscious and unconscious selves go through when editing, connecting and combining all that we know and feel into an original coherent thought happens over a period of time. It cannot be forced. It happens when we are awake and when we are asleep. It happens when we are thinking about something else entirely, or playing a game of tennis. It happens because of a stimulus in our immediate surroundings – usually without our knowing …

Excerpt from: Think Like an Artist: . . . and Lead a More Creative, Productive Life by Will Gompertz

πŸ’Ž Too often our brain works like a lawyer (it will find arguments to defend our convictions whatever the cost)

The explanation for Kahan’s results? Ideology. Irrespective of the actual figures, Democrats who identified as liberal, normally in favour of gun control, tended to find that stricter laws brought crime down. For the conservative Republican participants, the reverse was the case. They found that stricter gun control legislation did not work.

These answers are no longer to do with the truth, Kahan argued. They are about protecting your identity or belonging to your tribe! And the people who were good at maths, Kahan also found, were all the better at this. Often completely subconsciously, by the way. It was their psyche that played tricks on them.

Excerpt from: The Number Bias: How Numbers Lead and Mislead Us by Sanne Blauw

πŸ’Ž Stacking can make the adoption of new habits easier (like flossing)

In evidence that stacking works, consider dental floss. Many of us clean our teeth regularly but fail to floss. To test whether stacking increases flossing, researchers gave fifty British participants, who flossed on average only 1.5 times per month, information encouraging them to do it more regularly.

Half of the participants were told to floss before they brushed at night, and half after they brushed. Note that only half of the participants were really stacking – using an existing automated response (brushing their teeth) as a cue for a new behavior (flossing). The other half, who first flossed and then brushed, had to remember: oh, yes, first I need to floss, before I brush. No automated cue.

Each day for four weeks, participants reported by text whether they flossed the night before. At the end of the month of reminders, they all flossed about twenty-four days on average. Most interesting is what they were all doing eight months later. Those who stacked, and flossed after they brushed, were still doing it about eleven days a month. For them, the new behavior was maintained by the existing habit. The group originally instructed to floss before they brushed ended up doing it only about once a week.

Excerpt from: Good Habits, Bad Habits: The Science of Making Positive Changes That Stick by Wendy Wood

πŸ’Ž The paradox of progress (and the paradox of choice)

The paradox of progress, and the paradox of choice: There is a familiar story of a New York banker vacationing in Greece, who, from talking to a fisherman and scrutinizing the fisherman’s business, comes up with a scheme to help the fisherman make it a big business. The fisherman asked him what the benefits were; the banker answered that he could make a pile of money in New York and come back to vacation in Greece; something that seemed ludicrous to the fisherman, who was already there doing the kind of things bankers do when they go on vacation in Greece.

The story was well known in antiquity, under a more elegant form, as retold by Montaigne (my translation): When King Pyrrhus tried to cross into Italy, CynΓ©as, his wise adviser, tried to make him feel the vanity of such action. β€œTo what end are you going into such enterprise?” he asked. And Pyrrhus answered, β€œTo make myself the master of Italy.” CynΓ©as: β€œThen?” Pyrrhus: β€œTo conquer Africa, then … come rest at ease.” CynΓ©as: β€œBut you are already there; why take more risks?”

Excerpt from: Skin in the Game: Hidden Asymmetries in Daily Life by Nassim Nicholas Taleb

πŸ’Ž The premortem vs. postmortem (prevention vs. cure)

The premortem – when an organisation has almost come to an important decision but hasn’t formally committed itself, the decision makers gather for a brief session.

They are asked to imagine that it is one year later and that the idea has been a complete disaster.

They then have to write a short history of what happened.

The premortem can prevent many a disaster.

By contrast, a postmortem is always too late.

It’s all about timing.

Excerpt from: The Excellence Book: 50 Ways to be Your Best (Concise Advice) by Kevin Duncan

πŸ’Ž The Myers-Briggs test (a marriage test used by Fortune 500 companies)

Initially, Briggs had designed her questionnaire to identify solid marriage partners, but after the Second World War, her daughter repositioned it to place people in the right jobs. The MBTI does what all profiling systems do: asks batteries of questions and organises the answers into types which are supposed to define your personality. And yet the test has no basis in clinical psychology, though it is deployed by most Fortune 500 companies, many universities, schools, churches, consulting companies, the CIA, the army and the navy.

The MBTI test-retest validity lies below statistical significance, meaning that if you test someone more than once, you are likely to get different results. More worrying is that the questionnaire poses binary questions, asking, for example, whether you value sentiment more than logic or vice versa. The question assumes that there is a simple answer to this question, absent of context. Yet in real life, preference is highly contextual: I value logic when purchasing car insurance; I may value sentiment more when choosing to play with my son. Binaries always simplify, often to the point of absurdity, and they polarise what are often complements.

Excerpt from: Uncharted: How to Map the Future by Margaret Heffernan

πŸ’Ž The more we think about an event, the more we think it’s likely to happen (often wrongly)

One of the earliest experiments examining the power of imagination to sway intuition was conducted during the U.S. presidential election campaign of 1976. One group was asked to imagine Gerald Ford winning the election and taking the oath of office, and then they were asked how likely it was that Ford would win the election. Another group was asked to do the same for Jimmy Carter. So who was more likely to win? Most people in the group that imagined Ford winning said Ford. Those who saw Jimmy Carter taking the oath said Carter. Later experiments have obtained similar results. What are your odds of being arrested? How likely is it you’ll win the lottery? People who imagine the event consistently feel the odds of the event actually happening are higher than those who don’t.

Excerpt from: Risk: The Science and Politics of Fear by Dan Gardner

πŸ’Ž The labour illusion (or why incompetence sometimes pays)

But would he have preferred that the locksmith bumble around, take a long time and fake effort? Well, maybe. A locksmith once told Dan that when he started his career, he took forever to open a lock, and in the process, he often broke it, taking even more time and money to get one properly installed and finish the job. He charged for the parts to replace the broken lock as well as his standard fee for opening a locked door. People were happy to pay all this, and they tipped him well. He noticed, however, that as he became proficient and opened a lock quickly, without breaking the old lock (and without the consequent need to replace it and charge his clients for the extra parts), customers not only didn’t tip, but they also argued about his fee.

Wait, what? How much is it worth to have our door open? That should be the question. But because it’s difficult to put a price on this, we look at how much effort it takes to have that door unlocked. When there’s a great deal of effort, we feel much better about paying more. But all that should matter is the value of that open door.

Excerpt from: Small Change: Money Mishaps and How to Avoid Them by Dan Ariely and Jeff Kreisler

πŸ’Ž On the puzzling fact that only so few ad agency staff are over 50 (unlike other creative industries)

We’ll start with the Nobel Prize. There is only one Nobel Prize in a creative field. It is the prize for Literature. It went to Kazuo Ishiguro who is 64.

The Pulitzer Prize is awarded in several creative fields. The Pulitzer for Drama went to Lynn Nottage who is 54. The Pulitzer for History went to Heather Ann Thompson, age 55. The Pulitzer for Poetry went to Tyehimba Jess, age 53.

Next we move to television. The Emmy for Best Drama Series went to The Handmaid’s Tale. The novel was written by Margaret Atwood who was 79 and was creative consultant on the show. The Best Comedy Series went to Veep, executive produced by Julia Louis-Dreyfus, 57. She also won for Best Actress. Best Limited Series went to Big Little Lies created by David E Kelley, 62. The Best Supporting Actress was Ann Dowd, 62. Best Supporting Actor was John Lithgow, 73. Best Supporting Actor in a Comedy Series went to Alec Baldwin, 60.

So, let’s recap. People over 50 are creative enough to dominate in Nobels, Pulitzers, Oscars, and Emmys but are not creative enough to write a fucking banner ad. I guarantee you, not one of these brilliantly talented people could get a job in an ad agency today. Not one.

Excerpt from: Advertising for Skeptics by Bob Hoffman

πŸ’Ž On the need for businesses to be resilient (so they’re not blown away)

β€œWind extinguishes a candle and energizes fire. You want to be the fire and wish for the wind.”

Nassim Taleb

Excerpt from: The Science of Organizational Change: How Leaders Set Strategy, Change Behavior, and Create an Agile Culture (Leading Change in the Digital Age) by Paul Gibbons

πŸ’Ž Why it becomes harder to predict technological change (as technology develops)

Perhaps we should have seen this acceleration coming. In the 1930s an American aeronautical engineer named T. P. Wright carefully observed aeroplane factories at work. He published research demonstrating that the more often a particular type of aeroplane was assembled, the quicker and cheaper the next unit became. Workers would gain experience, specialised tools would be developed, and ways to save time and material would be discovered. Wright reckoned that every time accumulated production doubled, unit costs would fall by 15 per cent. He called this phenomenon ‘the learning curve’.
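Wright’s rule of thumb translates into a simple formula: after each doubling of cumulative production, unit cost is multiplied by 0.85. A minimal sketch (the function name and example figures are illustrative, using the 15 per cent fall per doubling quoted above):

```python
import math

def wright_unit_cost(first_unit_cost, cumulative_units, learning_rate=0.15):
    """Wright's learning curve: unit cost falls by `learning_rate`
    each time cumulative production doubles."""
    doublings = math.log2(cumulative_units)
    return first_unit_cost * (1 - learning_rate) ** doublings

# After 8 units (three doublings), the unit cost is 0.85 ** 3, about 61% of the first
print(round(wright_unit_cost(100.0, 8), 2))  # 61.41
```

The feedback loop Harford describes follows directly: each fall in cost boosts demand, which accelerates the next doubling, which cuts cost again.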

Three decades later, management consultants at Boston Consulting Group, or BCG, rediscovered Wright’s rule of thumb in the case of semiconductors, and then other products too. Recently, a group of economists and mathematicians at Oxford University found convincing evidence of learning curve effects across more than 50 different products from transistors to beer – including photovoltaic cells. Sometimes the learning curve is shallow and sometimes steep, but it always seems to be there.

The learning curve may be a dependable fact about technology, but paradoxically, it creates a feedback loop that makes it harder to predict technological change. Popular products become cheap; cheaper products become popular.

Excerpt from: The Next Fifty Things that Made the Modern Economy by Tim Harford

πŸ’Ž 9 suggested research techniques for planners

Start Doing Research Differently

1. Don’t only talk to the consumer. Talk to someone who spends their life understanding the target. Wife, kids, boss, subordinate, neighbor, garbage man, probation officer
2. Send them a disposable camera and a one-time brief
3. Get them to write something and word cloud it
4. Set up a video confessional booth
5. People love playing marketer. Give them your job
6. Think of the rote thing to do. Do the opposite
7. Get 10 smart people to write 10 Onion headlines for your brand or category
8. Go to their house as a forensic criminologist
9. Pitch ideas like this at your account people until you give them one that makes them think you’re insane. Then do that one

Excerpt from: Strategy Scrapbook by Alex Morris

πŸ’Ž One problem with the client-agency model is that it encourages complex clever answers (rather than simple, effective ones)

As we saw, a bureaucratized system will increase in complication from the interventionism of people who sell complicated solutions because that’s what their position and training invite them to do.

Things designed by people without skin in the game tend to grow in complication (before their final collapse).

There is absolutely no benefit for someone in such a position to propose something simple: when you are rewarded for perception, not results, you need to show sophistication. Anyone who has submitted a scholarly paper to a journal knows that you usually raise the odds of acceptance by making it more complicated than necessary.

Excerpt from: Skin in the Game: Hidden Asymmetries in Daily Life by Nassim Nicholas Taleb

πŸ’Ž How reform of organisations often requires an outside perspective (unfamiliarity with the department)

β€˜It should be remembered, that in few departments have important reforms been effected by those trained up in practical familiarity with their details. The men to detect blemishes and defects are among those who have not, by long familiarity, been made insensible to them.’

Excerpt from: The Next Fifty Things that Made the Modern Economy by Tim Harford

πŸ’Ž With uncertainty we prefer conformity (group think)

Crutchfield’s experiment involved slightly more ambiguous questions, including one in which people were asked if they agreed with the statement ‘I believe we are made better by the trials and hardships of life.’ Among subjects in a control group that was not exposed to the answers of others, everyone agreed. But among those in the experiment who thought that everyone else disagreed with the statement, 31 per cent said they did not agree. Asked whether they agreed with the statement ‘I doubt whether I would make a good leader,’ every person in the control group rejected it. But when the group was seen to agree with the statement, 37 per cent of people went along with the consensus and agreed that they doubted themselves.

Excerpt from: Risk: The Science and Politics of Fear by Dan Gardner

πŸ’Ž How our perception of risk is skewed by β€œwhat makes a good story or hypothesis”(rather than a cold calculation of the odds)

Many other studies produced similar results. Kahneman and Tversky divided 245 undergrads at the University of British Columbia in half and asked one group to estimate the probability of ‘a massive flood somewhere in North America in 1983, in which more than 1,000 people drown’. The second group was asked about ‘an earthquake in California sometime in 1983, causing a flood in which more than 1,000 people drown’. Once again, the second scenario logically has to be less likely than the first, but people rated it one-third more likely. Nothing says ‘California’ quite like β€˜earthquake’.
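The logic Kahneman and Tversky relied on is the conjunction rule: two things happening together can never be more probable than either one alone. A minimal sketch with invented illustrative probabilities (not figures from the study):

```python
# Invented illustrative probabilities – not figures from the study
p_flood = 0.10              # P(a massive North American flood)
p_quake_given_flood = 0.30  # P(a California earthquake caused it, given the flood)

# Conjunction rule: P(earthquake and flood) = P(flood) * P(quake | flood)
p_quake_and_flood = p_flood * p_quake_given_flood

# The joint event can never be more likely than the flood alone
assert p_quake_and_flood <= p_flood
print(round(p_quake_and_flood, 2))  # 0.03
```

Whatever numbers you plug in, multiplying by a conditional probability of at most 1 can only shrink the result – which is why rating the second scenario as more likely is a logical error, however vivid the story.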

Excerpt from: Risk: The Science and Politics of Fear by Dan Gardner

πŸ’Ž Origin of Lacoste (le crocodile)

RenΓ© Lacoste, French tennis star, earned the nickname ‘le crocodile’ for winning a crocodile-skin suitcase in a bet. ‘A friend drew a crocodile’, he said, ‘and I had it embroidered on the blazer I wore on the courts!’ His polo shirts were launched in 1933 and are probably the first example of sportswear as fashion.

Excerpt from: The Art of Looking Sideways by Alan Fletcher