Chilcot: Why we cover our ears to the facts

Image caption: Tony Blair speaking at a press conference after the Chilcot report's publication

Do people moderate their views when presented with overwhelming evidence to the contrary? Not necessarily, writes Matthew Syed.

We like to think that we apportion our beliefs to the evidence. After all, isn't this the hallmark of rationality? When information comes along which reveals we should change our minds, we do.

Or do we?

Consider an experiment in which two groups were recruited. One group was adamantly in favour of capital punishment. They had strong feelings on the issue, and had publicly backed the policy. The other group was adamantly against, horrified by "state-sanctioned murder".

These groups were then shown two dossiers. Each of these dossiers was impressive. They marshalled well-researched evidence. But here's the thing. The first dossier collated all the evidence in favour of capital punishment. The second collated all the evidence against.


Now you might suppose that, confronted by this contradictory evidence, the two groups would have concluded that capital punishment is a complex subject with arguments on both sides. You might have expected them to have moved a little closer in their views. In fact, the opposite happened - they became more polarised.

When asked about their attitudes afterwards, those in favour of capital punishment said they were impressed with the dossier citing evidence in line with their views. The data was rigorous, they said. It was extensive. It was robust. As for the other dossier - well, it was full of holes, shoddy, weak points everywhere.

Those against capital punishment drew the opposite conclusions. It was not just that they disagreed with the dossier that challenged their view; they also found its statistics unimpressive. From reading precisely the same material, they became even more entrenched in their positions.

What this experiment (and dozens of others like it) reveals is the way we filter new information when it challenges our strongly held beliefs or judgements. We use a series of post hoc manoeuvres to reframe anything inconvenient to our original position. We question the probity of the evidence, or the credentials of the people who discovered it, or their motives, or whatever. The more information that emerges to challenge our perspective, the more creatively we search for new justifications, and the more entrenched we become in our prior view.

This tendency is called "cognitive dissonance".

Image caption: When beliefs are challenged by evidence, people may become more entrenched in those beliefs

You can see the hallmarks of cognitive dissonance in the build-up to and aftermath of the Iraq War. The Chilcot report made pointed criticisms of the legal advice, the lack of cabinet oversight, and post-war planning and policy. But let us focus on the way the primary justification for war - namely, the existence of weapons of mass destruction (WMD) - was serially reframed.

On 24 September 2002, before the conflict, Tony Blair made a speech in which he emphatically stated: "His [Saddam Hussein's] WMD programme is active, detailed and growing… he has existing plans for the use of weapons, which could be activated in 45 minutes…"

The problem with this claim is that Saddam's troops didn't use such weapons to repel Western forces, and the initial search for WMD drew a conspicuous blank. And yet, as the social psychologists Jeff Stone and Nicholas Fernandez have pointed out in an essay on the Iraq conflict, Blair didn't amend his view - he reframed the evidence. In a speech to the House of Commons, he said: "There are literally thousands of sites... but it is only now that the Iraq Survey Group has been put together that a dedicated team of people… will be able to do the job properly… I have no doubt that they will find the clearest possible evidence of WMD."


So, to Blair, the lack of WMD didn't show that they were not actually there. Rather, it showed that inspectors hadn't been looking hard enough. Moreover, he had become more convinced of the existence of WMD, not less so.

Twelve months later, when the Iraq Survey Group couldn't find the weapons either, Blair still couldn't accept that WMD were not there. Instead, he changed tack again, arguing in a speech that "they could have been removed, they could have been hidden, they could have been destroyed".

So now, the lack of evidence for WMD in Iraq was no longer because troops hadn't had enough time to find them, or because of the inadequacy of the inspectors, but because Iraqi troops had spirited them out of existence.

Image caption: Tony Blair in Iraq after the 2003 invasion - the failure to find WMD did not change his mind

But this stance soon became untenable, too. As the search continued in a state of desperation, it became clear that not only were there no WMD, but there were no remnants of them, either. Iraqi troops could not have spirited them away.

And yet Blair now reached for a new justification for the decision to go to war. "The problem is that I can apologise for the information that turned out to be wrong, but I can't, sincerely at least, apologise for removing Saddam," he said in a speech. "The world is a better place with Saddam in prison."

This is not intended as an argument against Blair - rather, as an illustration of the reach of cognitive dissonance. Indeed, when you read the Chilcot report, this tendency, and not just with regard to WMD, peppers almost every page.

Image caption: Chilcot report - illustrations of cognitive dissonance "pepper" nearly every page

Science has changed the world because it prioritises evidence over conviction. Judgements are subservient to what the data tells us. The problem is that in many areas of our world, evidence is revised to fit with prior assumptions - and the tragedy is that we are often unaware of this process because it happens subconsciously. It is noteworthy, for example, that the Chilcot report nowhere states that Blair was actively deceitful.

The good news is that we can combat this tendency, and measurably improve our judgements, when we become alert to it. Indeed, the hallmark of pioneering institutions is that they deal with cognitive dissonance not by reframing inconvenient evidence, but by creating systems that learn from it (and thus avoid related biases such as "groupthink"). This should be the most important lesson of Chilcot.

When so-called Islamic State launched a major offensive in Iraq in 2014, and the country stood on the brink of civil war - a crisis some commentators linked to the 2003 invasion - Blair found another avenue of justification.

He pointed to the policy of non-intervention in Syria, a country which had descended into its own civil war. In an article written for his personal website, he said: "In Syria we called for the regime to change, took no action and it is in the worst state of all." In other words, he seemed to be suggesting: "If things look bad in Iraq now, they would have been even more awful if we had not invaded in 2003."

For our purposes, the most important thing is not whether Blair was right or wrong on this point, one which he reaffirmed this week. The vital thing to realise is that, had non-intervention in Syria achieved peace, Blair would likely still have found a way to interpret that evidence through the lens of the rightness of his decision to invade Iraq. In fact, he would probably have become more convinced of that rightness, not less so.

And this is why the Chilcot report, despite its mammoth detail, will have little effect on the core judgements of those involved with the Iraq War. As with everything else, it will simply be reframed.

Matthew Syed is the author of Black Box Thinking: Marginal Gains and the Secrets of High Performance

Follow @BBCNewsMagazine on Twitter and on Facebook