πŸ’Ž On how we evaluate ourselves by comparing to known references (cognitive dissonance)

Leon Festinger and Merrill Carlsmith of Stanford University once asked their students to carry out an hour of excruciatingly boring tasks. They then divided the subjects into two groups. Each student in group A received a dollar (it was 1959) and instructions to wax lyrical about the work to another student waiting outside – in other words, to lie. The same was asked of the students in group B, with one difference: they were given $20 for the task. Later, the students had to divulge how they had really found the monotonous work. Interestingly, those who received only a dollar rated it as significantly more enjoyable and interesting. Why? One measly dollar was not enough for them to lie outright; instead they convinced themselves that the work was not that bad. Just as Aesop’s fox reinterpreted the situation, so did they. The students who received more didn’t have to justify anything. They had lied and netted $20 for it – a fair deal. They experienced no cognitive dissonance.

Excerpt from: The Art of Thinking Clearly by Rolf Dobelli

πŸ’Ž On the power of anchoring (for items that are hard to value)

Another experiment: students and professional real-estate agents were given a tour of a house and asked to estimate its value. Beforehand, they were informed about a (randomly generated) listed sales price. As might be expected, the anchor influenced the students: the higher this price, the higher they valued the property. And the professionals? Did they value the house objectively? No, they were similarly influenced by the random anchor amount. The more uncertain the value of something – such as real estate, company stock or art – the more susceptible even experts are to anchors.

Excerpt from: The Art of Thinking Clearly by Rolf Dobelli

πŸ’Ž On how quickly in-groups form (even when the connection is obscure)

With sports affiliations, random birthplace suffices, and in business it is where you work. To test this, the British psychologist Henri Tajfel split strangers into groups, tossing a coin to choose who went to which group. He told the members of one group it was because they all liked a particular type of art. The results were impressive: although A) they were strangers, B) they were allocated a group at random and C) they were far from art connoisseurs, the group members found each other more agreeable than members of other groups.

Excerpt from: The Art of Thinking Clearly by Rolf Dobelli

πŸ’Ž On the difference between knowing the name of something and knowing something (like birds)

Richard Feynman: β€œYou can know the name of a bird in all the languages of the world, but when you’re finished, you’ll know absolutely nothing whatever about the bird… So let’s look at the bird and see what it’s doingβ€”that’s what counts. I learned very early the difference between knowing the name of something and knowing something.”

Excerpt from: The Art of the Good Life: Clear Thinking for Business and a Better Life by Rolf Dobelli

πŸ’Ž On the inaccuracy of forecasters (those who don’t know they don’t know)

Every day, experts bombard us with predictions, but how reliable are they? Until a few years ago, no one bothered to check. Then along came Philip Tetlock. Over a period of ten years, he evaluated 28,361 predictions from 284 self-appointed professionals. The result: in terms of accuracy, the experts fared only marginally better than a random forecast generator. Ironically, the media darlings were among the poorest performers; and of those the worst were the prophets of doom and disintegration. Examples of their far-fetched forecasts included the collapse of Canada, Nigeria, China, India, Indonesia, South Africa, Belgium and the E.U. None of these countries has imploded.

β€˜There are two kinds of forecasters: those who don’t know, and those who don’t know they don’t know,’ wrote Harvard economist J. K. Galbraith.

Excerpt from: The Art of Thinking Clearly by Rolf Dobelli

πŸ’Ž On staying within your circle of competence (when not to bet)

In his far from mediocre book Risk Intelligence, Dylan Evans describes a professional backgammon player by the name of J.P. “He would make a few deliberate mistakes to see how well his opponent would exploit them. If the other guy played well, J.P. would stop playing. That way, he wouldn’t throw good money after bad. In other words, J.P. knew something that most gamblers don’t: he knew when not to bet.” He knew which opponents would force him out of his circle of competence, and he learned to avoid them.

Excerpt from: The Art of the Good Life: Clear Thinking for Business and a Better Life by Rolf Dobelli

πŸ’Ž On the Knowledge Illusion (how do zippers work?)

How does a zipper work? Rate your understanding on a scale from 0 (no clue) to 10 (easy-peasy). Write the number down. Now sketch out on a piece of paper how a zipper actually works. Add a brief description, as though you were trying to explain it very precisely to someone who’d never seen a zipper before. Give yourself a couple of minutes. Finished? Now reassess your understanding of zippers on the same scale.

Leonid Rozenblit and Frank Keil, researchers at Yale University, confronted hundreds of people with equally simple questions. How does a toilet work? How does a battery work? The results are always the same: we think we understand these things reasonably well until we’re forced to explain them. Only then do we appreciate how many gaps there are in our knowledge. You’re probably similar. You were convinced you understood more than you actually did. That’s the knowledge illusion.

Excerpt from: The Art of the Good Life: Clear Thinking for Business and a Better Life by Rolf Dobelli

πŸ’Ž On how higher prices increase joy but not happiness (think about your car)

How much pleasure do you get from your car? Put it on a scale from 0 to 10. If you don’t own a car, then do the same for your house, your flat, your laptop, anything like that. Psychologists Norbert Schwarz, Daniel Kahneman and Jing Xu asked motorists this question and compared their responses with the monetary value of the vehicle. The result? The more luxurious the car, the more pleasure it gave the owner. A BMW 7 Series generates about fifty percent more pleasure than a Ford Escort. So far, so good: when somebody sinks a load of money into a vehicle, at least they feel a good return on their investment in the form of joy.

Now, let’s ask a slightly different question: how happy were you during your last car trip? The researchers posed this question too, and again compared the motorists’ answers with the values of their cars. The result? No correlation. No matter how luxurious or how shabby the vehicle, the owners’ happiness ratings were all equally rock bottom.

Excerpt from: The Art of the Good Life: Clear Thinking for Business and a Better Life by Rolf Dobelli

πŸ’Ž On confusing chauffeur knowledge with real knowledge (the danger of surface knowledge)

After receiving the Nobel Prize for Physics in 1918, Max Planck went on tour across Germany. Wherever he was invited, he delivered the same lecture on new quantum mechanics. Over time, his chauffeur grew to know it by heart: ‘It has to be boring giving the same speech each time, Professor Planck. How about I do it for you in Munich? You can sit in the front row and wear my chauffeur’s cap. That’d give us both a bit of variety.’ Planck liked the idea, so that evening the driver gave a long lecture on quantum mechanics in front of a distinguished audience. Later, a physics professor stood up with a question. The driver recoiled: ‘Never would I have thought that someone from such an advanced city as Munich would ask such a simple question! My chauffeur will answer it.’

Excerpt from: The Art of Thinking Clearly by Rolf Dobelli

πŸ’Ž On why we should seek out disconfirming evidence when formulating a theory (confirmation bias)

No professionals suffer more from the confirmation bias than business journalists. Often, they formulate an easy theory, pad it out with two or three pieces of ‘evidence’ and call it a day. For example: “Google is so successful because the company nurtures a culture of creativity.” Once the idea is on paper, the journalist corroborates it by mentioning a few other prosperous companies that foster ingenuity. Rarely does the writer seek out disconfirming evidence, which in this instance would be struggling businesses that live and breathe creativity or, conversely, flourishing firms that are utterly uncreative. Both groups have plenty of members, but the journalist simply ignores them. If he or she were to mention just one, the storyline would be ruined.

Excerpt from: The Art of Thinking Clearly by Rolf Dobelli

πŸ’Ž On analysing successful brands and looking for a recipe for success (is often misleading)

A quick hypothesis: say one million monkeys speculate on the stock market. They buy and sell stocks like crazy, and, of course, completely at random. What happens? After one week, about half of the monkeys will have made a profit and the other half a loss. The ones that made a profit can stay; the ones that made a loss you send home. In the second week, one half of the monkeys will still be riding high, while the other half will have made a loss and are sent home. And so on. After ten weeks, about 1,000 monkeys will be left — those who have always invested their money well. After twenty weeks, just one monkey will remain — this one always, without fail, chose the right stocks and is now a billionaire. Let’s call him the success monkey.
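The halving in this thought experiment is just powers of two: since 2^20 = 1,048,576 is roughly one million, twenty weekly coin flips are enough to whittle a million random traders down to a single lucky survivor. A minimal sketch, using the exact count 2**20 as a stand-in for the text’s round β€œone million”:

```python
# Survivorship by pure chance: each week roughly half the remaining
# "monkeys" profit and stay; the other half lose and are sent home.
# 2**20 = 1,048,576 stands in for the text's round "one million".

def survivors(start: int, weeks: int) -> int:
    """Monkeys still on a perfect winning streak after `weeks` halvings."""
    return start // (2 ** weeks)

print(survivors(2**20, 10))  # 1024 -- about the "1,000 monkeys" after ten weeks
print(survivors(2**20, 20))  # 1    -- the lone "success monkey"
```

No skill is modelled anywhere in this sketch; the single survivor is guaranteed by the arithmetic alone.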

How does the media react? They will pounce on this animal to understand its “success principles”. And they will find some: perhaps the monkey eats more bananas than the others. Perhaps he sits in another corner of the cage. Or, maybe he swings headlong through the branches, or he takes long, reflective pauses while grooming. He must have some recipe for success, right? How else could he perform so brilliantly? Spot-on for twenty weeks — and that from a simple monkey? Impossible!

Also known as: Outcome Bias.

Excerpt from: The Art of Thinking Clearly by Rolf Dobelli

πŸ’Ž On the power of loss aversion in healthcare (increased awareness)

For this reason, if you want to convince someone about something, don’t focus on the advantages; instead highlight how it helps them dodge the disadvantages. Here is an example from a campaign promoting breast self-examination (BSE): two different leaflets were handed out to women. Pamphlet A urged: “Research shows that women who do BSE have an increased chance of finding a tumour in the early, more treatable stage of the disease.” Pamphlet B said: “Research shows that women who do not do BSE have a decreased chance of finding a tumour in the early, more treatable stage of the disease.” The study revealed that pamphlet B (written in a “loss-frame”) generated significantly more awareness and BSE behaviour than pamphlet A (written in a “gain-frame”).

Excerpt from: The Art of Thinking Clearly by Rolf Dobelli

πŸ’Ž On giving the story a face (statistics don’t stir us, people do)

In another experiment, psychologist Paul Slovic asked people for donations. One group was shown a photo of Rokia from Malawi, an emaciated child with pleading eyes. Afterward, people donated an average of $2.83 to the charity (out of $5 they were given to fill out a short survey). The second group was shown statistics about the famine in Malawi, including the fact that more than three million malnourished children were affected. The average donation dropped by 50%. This is illogical: you would think that people’s generosity would grow if they knew the extent of the disaster. But we do not function like that. Statistics don’t stir us; people do.

The media have long known that factual reports and bar charts do not entice readers. Hence the guideline: give the story a face.

Excerpt from: The Art of Thinking Clearly by Rolf Dobelli

πŸ’Ž On how poorly set targets lead to unintended consequences (Dead Sea scrolls to company boards)

In 1947, when the Dead Sea scrolls were discovered, archaeologists set a finder’s fee for each new parchment. Instead of lots of extra scrolls being found, they were simply torn apart to increase the reward. Similarly, in China in the nineteenth century, an incentive was offered for finding dinosaur bones. Farmers located a few on their land, broke them into pieces and cashed in. Modern incentives are no better: company boards promise bonuses for achieved targets. And what happens? Managers invest more energy in trying to lower the targets than in growing the business.

Excerpt from: The Art of Thinking Clearly by Rolf Dobelli

πŸ’Ž On how we tend to overestimate the number of people who share our views (we like to think we’re in the popular majority)

Stanford psychologist Lee Ross hit upon this in 1977. He fashioned a sandwich board emblazoned with the slogan β€˜Eat at Joe’s’ and asked randomly selected students to wear it around campus for thirty minutes. They also had to estimate how many other students would put themselves forward for the task. Those who declared themselves willing to wear the sign assumed that the majority (62%) would also agree to it. On the other hand, those who politely refused believed that most people (67%) would find it too stupid to undertake. In both cases, the students imagined themselves to be in the popular majority.

Excerpt from: The Art of Thinking Clearly by Rolf Dobelli

πŸ’Ž On social loafing reducing the effectiveness of teams (why teams are lazy)

In 1913 Maximilian Ringelmann, a French engineer, studied the performance of horses. He concluded that the power of two animals pulling a coach did not equal twice the power of a single horse. Surprised by this result, he extended his research to humans. He had several men pull a rope and measured the force applied by each individual. On average, if two people were pulling together, each invested just 93% of their individual strength; when three pulled together, it was 85%; and with eight people, just 49%.

Excerpt from: The Art of Thinking Clearly by Rolf Dobelli

πŸ’Ž On how statistics lack emotional impact when compared to images (numbers versus coffins)

For eighteen years, the American media was prohibited from showing photographs of fallen soldiers’ coffins. In February 2009, defence secretary Robert Gates lifted this ban and images flooded on to the Internet. Officially, family members have to give their approval before anything is published, but such a rule is unenforceable. Why was this ban created in the first place? To conceal the true costs of war. We can easily find out the number of casualties, but statistics leave us cold. People, on the other hand, especially dead people, spark an emotional reaction.

Excerpt from: The Art of Thinking Clearly by Rolf Dobelli

πŸ’Ž On the importance of getting a representative sample (telephone survey to assess telephone ownership)

Particularly amusing is this recent telephone survey: a company wanted to find out, on average, how many phones (landline and cell) each household owned. When the results were tallied, the firm was amazed that not a single household claimed to have no phone. What a masterpiece.

Excerpt from: The Art of Thinking Clearly by Rolf Dobelli

πŸ’Ž On messages from untrustworthy sources still having an impact (why propaganda works)

Amazingly, just the opposite is true for propaganda. If it strikes a chord with someone, this influence will only increase over time. Why? Psychologist Carl Hovland, who led the study for the war department, named this phenomenon the sleeper effect. To date, the best explanation is that, in our memories, the source of the argument fades faster than the argument. In other words, your brain quickly forgets where the information came from (e.g. from the department of propaganda). Meanwhile, the message itself (i.e., war is necessary and noble) fades only slowly or even endures. Therefore, any knowledge that stems from an untrustworthy source gains credibility over time. The discrediting force melts away faster than the message does.

Excerpt from: The Art of Thinking Clearly by Rolf Dobelli