One stark and surprising study nicely illustrates the direction of such research. Conducted by Shai Danziger of Ben-Gurion University, it looked into the parole decisions of Israeli judges. The researchers examined 1,112 parole hearings and observed a startling trend: at the start of the day, around two-thirds of the people before the court were granted parole. Just before the lunch break, this fell almost to nothing – but straight after the break, it jumped back above 60 per cent. The pattern repeated around a second meal break.
About thirty years ago, an economist at the Bank of Israel named Michael Landsberger undertook a study of a group of Israelis who were receiving regular restitution payments from the West German government after World War II. Although these payments could without exaggeration be described as blood money – inasmuch as they were intended to make up for Nazi atrocities – they could also fairly accurately be described as found money. Because of this, and because the payments varied significantly in size from one individual or family to another, Landsberger was able to gauge the effect of the size of such windfalls on each recipient's spending rate. What he discovered was striking. The group of recipients who received the largest payments (equal to about two-thirds of their annual income) had a spending rate of about 0.23. In other words, for every dollar they received, they spent 23 cents; the rest was saved. Conversely, the group that received the smallest windfall payments (equal to about 7 percent of annual income) had a spending rate of 2. That is, for every dollar of found money they spent two dollars: the dollar itself, plus another dollar drawn from “savings” (what they had actually saved or what they might otherwise have saved).
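The spending-rate arithmetic above can be made concrete with a short sketch. The two rates are the group averages reported in the passage; the helper function itself is purely illustrative and not part of Landsberger's study:

```python
def spent_per_dollar(spending_rate: float) -> str:
    """Break down what a given spending rate implies per $1 of windfall."""
    spent = spending_rate                  # dollars spent per dollar received
    from_savings = max(0.0, spent - 1.0)   # spending beyond $1 must come from savings
    saved = max(0.0, 1.0 - spent)          # spending under $1 leaves the rest saved
    return (f"per $1 of windfall: spent ${spent:.2f}, "
            f"saved ${saved:.2f}, drew ${from_savings:.2f} from savings")

print(spent_per_dollar(0.23))  # large windfalls (~2/3 of annual income)
print(spent_per_dollar(2.0))   # small windfalls (~7% of annual income)
```

A spending rate below 1 means part of each dollar is banked; a rate above 1 means the windfall triggers extra spending out of existing savings.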
Half of the respondents, drawn from a national sample of relatively affluent households, were asked whether they could comfortably save 20 percent of their income, and the other half were asked whether they could comfortably live on 80 percent of their income. Of course, to save 20 percent of your income is to live on 80 percent of it. Nevertheless, whereas only half of the respondents thought they could save 20 percent of their earnings, four out of five thought they could comfortably live on 80 percent.
This difference in reactions cannot be chalked up to simple financial illiteracy. People find beef that is 80 percent lean more appealing than beef that is 20 percent fat. They are more impressed by condoms whose manufacturers boast of a 95 percent success rate than by those admitting a 5 percent failure rate. And they are more supportive of taxes on the rich when the existing level of income inequality is described in terms of how much more the rich earn than the median wage earner than when it is described in terms of how much less the median wage earner earns than the rich.
With this in mind, in early 2000 the board of trustees of Ursinus College adopted a proposal designed to increase applications that at first might seem counterintuitive: it raised tuition nearly 20 percent. The policy flew in the face of conventional economic theory, according to which a drop in price is the surest way to increase demand. Unconventional or not, it worked: applications soared. The strategy has been employed with equal success by a number of other colleges, including Bryn Mawr, Notre Dame, and Rice.
Although it is not what standard economic theory would recommend, it’s easy to see why raising tuition would increase the number of applicants. Parents want to send their kids to high-quality, prestigious schools. But academic quality and prestige are hard to assess, and so they use price as an indicator of quality. If it costs a lot, they tell themselves, it must be good.
It is natural — reflexive even — to see the causes of human action in the character and disposition of those doing the acting. But as George Eliot noted at the end of Middlemarch, “There is no creature whose inward being is so strong that it is not greatly determined by what lies outside it.”
In one telling study, research participants heard sentences with the first part of a key word omitted (indicated here by “*”), with different endings of the sentence presented to different participants. Thus, some participants heard “The *eel was on the axle,” and others heard “The *eel was on the orange.” In both cases, the participants reported hearing a coherent sentence – “The wheel was on the axle” in the first case and “The peel was on the orange” in the second – without ever consciously registering the gap. Nor did they register that they themselves had supplied the “wh” or “p” they “heard” in order to make sense of the sentence.
As Harvard social psychologist Joshua Greene put it, “The best way to get people to do something is to tell them their neighbours are already doing it.”
Stanford psychologist Lee Ross hit upon this in 1977. He fashioned a sandwich board emblazoned with the slogan ‘Eat at Joe’s’ and asked randomly selected students to wear it around campus for thirty minutes. They also had to estimate how many other students would put themselves forward for the task. Those who declared themselves willing to wear the sign assumed that the majority (62%) would also agree to it. On the other hand, those who politely refused believed that most people (67%) would find it too stupid to undertake. In both cases, the students imagined themselves to be in the popular majority.
Excerpt from: The Art of Thinking Clearly by Rolf Dobelli
An especially common version of this phenomenon is what psychologists call denominator neglect. If you want people to be impressed by an amount, choose a large scale (“$365 a year”); if you want them not to be impressed, choose a small scale (“only a dollar a day”). The effects of strategically choosing the right scale (i.e., the right denominator) can be dramatic. In one study, respondents judged a disease that kills 1,200 out of every 10,000 afflicted individuals to be more dangerous than one that’s twice as lethal, killing 24 out of every 100.
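The arithmetic behind the two disease descriptions makes the bias plain. This is just a minimal check of the figures quoted above, not part of the study itself:

```python
# Mortality rates implied by the two framings described above.
rate_large_frame = 1_200 / 10_000   # "1,200 out of every 10,000" -> 12%
rate_small_frame = 24 / 100         # "24 out of every 100"       -> 24%

# The second disease is twice as lethal, yet the big numerator in the
# first framing is what sticks in the mind -- denominator neglect.
print(f"{rate_large_frame:.0%} vs {rate_small_frame:.0%}")  # prints "12% vs 24%"
```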
And hundreds of surveys have shown that men and women of all ages, regions of the country, and even social classes tend to rate themselves as above average on almost any positive dimension: more sensitive than average, more unbiased, better leaders, better drivers—you name it. Perhaps the most remarkable finding is that even people who are in the hospital for injuries sustained in automobile accidents rate themselves on average as better than average drivers.
In a provocative exploration of this idea, nursery school children were asked to draw a picture with what was then a novel drawing tool: felt-tip markers. Some were offered a prize for drawing a picture with the markers; others received the prize unexpectedly, after they had already drawn their picture; still others were offered no incentive at all. When the markers were later introduced into a free play period, the children who had drawn a picture in order to get a reward played with them significantly less often than children who had not been “bribed” to draw their picture. In essence, the promise of a reward turned play into work. But when the prize was unexpected – when it was experienced not as a bribe but as a bonus – it did not decrease the children’s interest in playing with the markers.