My personal favourite from Festinger’s many brilliant examples of cognitive dissonance deals with people’s beliefs about the link between smoking and lung cancer. He was writing at the very birth of research on the causes of cancer. It was a unique window of time during which it was possible to test whether groups of smokers and non-smokers would accept or reject new information that had been uncovered about a link. Festinger saw the effects you’d expect from anyone suffering cognitive dissonance: heavy smokers – those who had the most to lose from the new research being right – were the most resistant to believing that a link had been proven; only 7 per cent accepted the validity of the new research. More than twice as many moderate smokers accepted the link, at 16 per cent. Non-smokers were much more willing than smokers to believe the link had been proven, but as a mark of just how far social norms have swung since then, only 29 per cent of them believed the link had been proven, despite having nothing to lose.
The ‘Save More Tomorrow’ programme tackles both of these barriers head-on by, first, auto-enrolling people onto workplace saving schemes to combat inertia. People are obviously completely free to opt back out, but, human nature being what it is, 90 per cent stay on the scheme, with their inertia now working for them rather than against them. An auto-escalator then ups the contributions, not immediately but over time (the tomorrow bit). This deferral shifted the reluctant hugely: when asked whether they would up their contributions now by five percentage points, most said no (we need the chocolate now). But when asked whether they’d commit to saving more in the future, 78 per cent said yes.
The impact of ‘Save More Tomorrow’ has been substantial. Before the programme, the average saving rate for workers in the sample was 3.5 per cent, but after four years this had increased nearly four-fold to 13.6 per cent.
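The mechanism above can be sketched in a few lines of code. The escalation schedule here is an assumption for illustration – a flat annual increase of three percentage points with a cap – chosen only so the numbers land on the figures quoted in the text; the real programme tied increases to events such as pay rises, and the exact steps varied by employer.

```python
# Illustrative sketch of a 'Save More Tomorrow'-style auto-escalator.
# The step size and cap are assumptions chosen to match the figures in
# the text (3.5 per cent rising to 13.6 per cent over four years); they
# are not the programme's actual parameters.

def escalate(start_rate, step, years, cap):
    """Raise the contribution rate by `step` points each year, capped at `cap`."""
    rate = start_rate
    history = [rate]
    for _ in range(years):
        rate = min(rate + step, cap)  # inertia now works for the saver
        history.append(rate)
    return history

rates = escalate(start_rate=3.5, step=3.0, years=4, cap=13.6)
print(rates)  # [3.5, 6.5, 9.5, 12.5, 13.6]
```

The point of the sketch is the shape, not the numbers: each increase happens automatically, so doing nothing – the default – now raises the saving rate rather than leaving it flat.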
The effect was first demonstrated by behavioural psychologists Daniel Kahneman and Amos Tversky in 1973. In their classic experiment, they asked people to listen to a list of names and then recall whether there were more men or women on the list. Some people in the experiment were read a list of famous men and less famous women, while others were read the opposite. Afterwards, when quizzed by the researchers, individuals were more likely to say that there were more of the gender from the group with more famous names. Later researchers have linked this effect to how easily people could retrieve information: we tend to over-rely on what we can remember easily when coming to decisions or judgements.
There is a long tradition of attempting to test whether the truth changes people’s perceptions, in both academic and campaigning work, but the results remain mixed and inconclusive. Some studies show no impact at all on perceptions when we are told the correct figures, while others show some impact on certain beliefs, but not others. And some show more marked changes. In one more hopeful, recent example from a study in thirteen countries, the researchers split the group of respondents in two. They told one half some facts about actual immigration levels, and said nothing to the other half. Those armed with the correct information were less likely to say there were too many immigrants. On the other hand, they did not change their policy preferences: they were not more likely to support facilitating legal immigration. When the researchers went back to the same group four weeks later, the information had stuck for most – although so had the policy preferences. This fits with long-identified theories that facts struggle to cut through our partisan beliefs, or our ‘perceptual screen’, as Angus Campbell and colleagues outlined in their classic book, The American Voter, back in 1960.