Alexander McCall Smith: Beware of hindsight bias in Covid fury

We’re all amateur virologists and epidemiologists now. How many conversations have you had on viruses, on the R number, on the finer points of immunity?
Medical staff wearing personal protective equipment (PPE) wait to receive coronavirus patients

How many times have you expressed a view on the risk of surface transmission? There may be some who have remained tight-lipped in the current crisis, but they are probably a rather small minority; most of us have been talking of little else. All of this, of course, is quite understandable. We are in disturbing and uncharted territory and it would be odd if we had nothing to say about where we find ourselves.

Now, of course, we are approaching the stage in the crisis when blame is beginning to be laid at various doors, and we all have views on that too. There have been some egregious examples of finger-pointing at an international level, even if some of the allegations of lack of openness, or what used to be called lying, may well be justified. But there is plenty of scope for much more local assessment of the response at national level, and that is where, if we have any concern for fairness, it might be useful for us to remind ourselves of hindsight bias. Before we criticise individuals or institutions for their failures, we should perhaps think a bit about how the wisdom of hindsight operates in these circumstances.

The psychologists have plenty to tell us here. In simple terms, hindsight bias operates when we look back on an event or series of events and say, “Of course that was bound to happen.” Such a judgement might be couched in terms of foreseeability, prompting us to say, “Yes, it was perfectly foreseeable back then that x or y would eventually happen”.

What we are doing here is asserting that, if anybody had asked us back then, we would of course have foreseen that x or y would happen further down the line. And because something is foreseeable, we feel that we can blame others for their failure to have foreseen it and taken timely action.

In our current context, this involves statements like, “It was foreseeable six months ago that there would be a shortage of protective equipment in the event of a pandemic”. That may be true: there may be certain consequences that are so obvious and so serious that they seem reasonably foreseeable to anybody who is reasonably alert; but as often as not our judgements of what was foreseeable are affected by our knowledge – acquired later – of what actually did happen. That is hindsight bias, and its effect is to make us much readier to attribute blame.

We take the view that because we think we would have seen this coming, others should have done so too. The problem, though, is that often we simply would not have foreseen the things we claim we would have foreseen. We’re setting ourselves up as being far more prescient than we actually are.

Psychologists have been able to demonstrate this tendency to expand the scope of the foreseeable experimentally. It is actually quite simple, and you can do a little (virtual) experiment with your friends: take any causal chain of events and list them as events 1 to 10. For example: (1) a dog negligently let out of a garden (2) runs into the road (3) where it causes a motorist to swerve, (4) which topples a lamp-post, (5) which causes a power cut, and so on up to (10) a major fire in a factory. 1 is the first event in the chain of causes, and 10 is an eventual result. Show the list to your research subjects and ask them whether, at the time event 1 occurred, it would have been foreseeable that consequence 10 would eventually ensue. Because they know the full sequence of events, many of the subjects – being human – will say, “Of course anybody could have told you that if 1 happened then 10 would be the eventual result”.

All right. Now show another group of research subjects only the list of events 1 to, say, 5, without telling them about events 6 to 10 – in other words, show them an incomplete sequence in the chain of events. Ask them to say what the foreseeable result would be. It is possible that some may correctly identify event 10 (which they have not been told about) as likely to happen, but in practice it is far more likely that they will be unable to say what the eventual consequence will be.

The implications of this are fairly clear. Hindsight bias makes ex post facto experts of us all. It leads us to believe that conduct which may have been perfectly reasonable at the time was, in fact, faulty because it failed to anticipate what we now see as the obvious. Our judgements of responsibility may therefore be far too harsh. We think we would have done much better at arming ourselves against events, but the reality is that we could well have done exactly what those whom we are criticising did. That does not mean, of course, that we cannot or should not assess what has been done, or what has not been done. What it does mean, though, is that we should be very careful about attributing blame, and should take into account that those making crucial decisions often do so without knowing how events may eventually unfold. At the end of the day, the most important thing, surely, is for lessons to be learned rather than for individuals to be singled out for criticism, especially when the basis of that criticism is affected by hindsight bias.

The psychologists might have the penultimate word. A special form of hindsight bias identified in psychology is something called retroactive pessimism. This occurs when you look back at some disaster and say, “That was bound to happen – that was unavoidable”. That may be true, but such comments may simply be the product of a pessimistic cast of mind that, having seen what has happened, assumes that it was inevitable because the worst always happens. But nothing is inevitable, unless one is a determinist, which is another problem altogether, because determinism is a complete excuse – for everybody – which can’t be true. Or, if it is true, it can’t be the basis on which we lead our lives.

I knew this would happen. I knew I’d end up in deep philosophical water.
