It’s important we look at how and why we make decisions – for example why we choose to pursue one idea over another. One way of doing this is exploring how we view our world through ‘mental models’, which provide a convenient way of making sense of experience.
Mental models are the brain’s way of encoding the vast amounts of information we have to process every second of every day. It is thought that one part of the brain absorbs the raw material and rapidly transfers it to storage areas as memory. Then, when information is required, it is retrieved in the form of memory ‘modules’. This is of course a simplification of a very complex process, but it serves to demonstrate why our experience seems structured rather than random.
Most of the time, this works efficiently, with the models providing a pre-digested picture of the world, and a rapid retrieval system to make sense of this picture. The problems arise when this system is challenged. Information that ‘fits the picture’ can easily be used and built upon. But information that doesn’t fit needs to be dealt with, and the quickest solution is to reject or distort it. This is known as confirmation bias.
From the earliest days of psychoanalysis, Sigmund Freud, himself a neurologist by training, was fascinated by what he termed denial, projection and distortion (and some variations on these themes), and by the anxiety caused by having one’s views challenged. Although he could observe this phenomenon, he of course lacked the tools now available to fully understand its causes.
Confirmation bias is powerful and pervasive. It’s the default position (‘you do agree, don’t you?’), and resists challenge. This is why individuals often behave irrationally when confronted with evidence that does not conform to their world-view.
English psychologist Peter Wason coined the term in the 1960s. In his experiment he presented participants with three numbers – 2, 4, 6 – and asked them to discover the rule governing the sequence by proposing sequences of their own, each of which he would confirm or reject. The research has been repeated a number of times, with the same outcome. Consistently, participants believe they have ‘got it’ after only four or five guesses, and they are almost invariably wrong. This is because they immediately decide what the rule is, then propose only sequences that confirm their hypothesis – assuming, say, that the rule is ‘increments of two’ or ‘rising even numbers’. In fact the rule is simply ‘rising numbers’, which could be discovered by testing sequences that break the assumed rule, such as odd numbers or unequal steps.
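The logic of the 2-4-6 task can be made concrete with a short sketch. The hidden rule and the participant hypothesis below are illustrative reconstructions of the experiment described above, not code from Wason’s research:

```python
# A minimal sketch of Wason's 2-4-6 task.

def hidden_rule(triple):
    """The experimenter's actual rule: any strictly rising sequence."""
    a, b, c = triple
    return a < b < c

def increments_of_two(triple):
    """A typical participant hypothesis: each number rises by two."""
    a, b, c = triple
    return b - a == 2 and c - b == 2

# Confirmatory testing: every probe is chosen to fit the hypothesis.
confirmatory_probes = [(2, 4, 6), (10, 12, 14), (1, 3, 5)]
for t in confirmatory_probes:
    # Both rules say 'yes' to all of these, so the probes
    # cannot distinguish the hypothesis from the real rule.
    assert hidden_rule(t) and increments_of_two(t)

# A disconfirming probe: deliberately violates the hypothesis.
probe = (1, 2, 50)
print(hidden_rule(probe), increments_of_two(probe))  # True False
```

The disconfirming probe is the informative one: it fits the hidden rule but not the participant’s hypothesis, immediately revealing that ‘increments of two’ is too narrow. Probes chosen to confirm a hypothesis can never falsify it.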
Does this matter? The answer is an emphatic Yes. In scientific research, ‘inconvenient’ results are often ignored or rejected because they don’t fit preconceived views, and the work fails as a result. A large McKinsey study of business investment decisions found that companies which worked at reducing the effect of bias in their decision-making processes achieved markedly better results.
How do you overcome the influence of confirmation bias? Some 500 years ago, Leonardo da Vinci advocated developing a ‘tolerance of ambiguity’, and devised a number of exercises to help himself see things differently. Twentieth century experts spoke of ‘making the strange familiar, and the familiar strange’. Using techniques such as mindfulness, it becomes possible first to notice, then to adjust, the tendency to confirmation bias.
Taking a more modern example, investor Warren Buffett invites critics to openly question his investment decisions. Some investors also imagine that their stock portfolio has completely collapsed, and ask themselves why this might happen. It takes humility and bravery to adopt this approach, but it’s one of the surest ways of ensuring you have a complete analysis of the issue at hand.
Be vigilant. Being aware of and avoiding confirmation bias requires regular effort, and it’s easy to slip back into old habits of thought. You do agree, don’t you?