Trams inquiry will offer the chance to challenge the cognitive biases around the key decisions in the debacle, writes John Sturrock
IT IS refreshing that Lord Hardie has set out clear details for his inquiry into the Edinburgh trams project.
It will be fascinating to see how extensively it explores the underlying reasons for the decisions that were taken. There is much research into what are described as “cognitive biases”, the errors of thinking that we all tend to fall into, especially when under pressure. It could be a real breakthrough to understand the impact these may have had on choices made and outcomes reached regarding the trams. It might not be overstating things to suggest this inquiry could become a world leader in identifying some of the habits and errors which neuroscience shows affect real-life decision-making.
How might these apply to the trams project? Rolf Dobelli’s The Art of Thinking Clearly addresses many of them in 99 short chapters.
Take “optimism bias”, or over-confidence. “It will only cost £x and we can confirm that this project will certainly be delivered within budget and on time.” How often do we hear this and believe it? People systematically overestimate their knowledge and ability to predict. This is also an example of what is called “strategic misrepresentation”, which afflicts major projects in particular, and of the “planning fallacy”, when groups overestimate benefits and underestimate costs and risks.
Many public procurement exercises now have a built-in provision for this bias. Under-pricing may be a significant problem in construction contracts, at least where cost is the major factor in deciding on tenders, with contractors seeking to make up for under-pricing by other claims. No doubt this may be raised in the inquiry, especially as it can have an “anchoring” effect, a false baseline for judging subsequent performance.
Then there is group-think, peer pressure or the herd instinct. If one or more of us forms a view, and our status lends authority to it, others will tend to follow for fear of being different, or of looking silly. Even if someone knows that something is not right, they filter out that doubt if it does not fit the generally agreed picture. “If others all agree, I must be wrong…” It is well known that the Challenger space shuttle was lost because of this kind of thinking. Team decision-making may be a good thing, but the effect can be a diffusion of responsibility.
This can be an illustration of “authority bias” – when someone is apparently expert in a field, we find ourselves deferring to them, even if the objective evidence stacks up against their view. Thus, airlines have instituted “crew resource management” to encourage co-pilots to discuss reservations with the captain, openly and quickly. Challenge is vital. What might have happened with RBS if such an approach had prevailed in its boardroom?
We might link this with over-reliance on intuition, which apparently afflicts us the more senior we become. “I have a sense that this is what we should do…” “My gut instinct and/or experience tells me…” This can lead us to ignore the most obvious contradictory information. “Confirmation bias” is another familiar misconception. We interpret all new information to fit our existing conclusions or beliefs.
And then “cognitive dissonance” creeps in. We’ll find a way to explain that view, even though it is objectively wrong. Add into the mix “flashbulb memories”, the unassailable belief that we have a clear recollection of past events; even memories which are extraordinarily vivid and detailed can turn out to be riddled with inaccuracies.
Lord Hardie is charged with finding answers. Here, hindsight (or outcome) bias could be a concern. In retrospect, many things seem clear and inevitable. But that may be a mistake. In a complex world, it is far more difficult to forecast an outcome than to fit it all together afterwards. A bad result does not necessarily imply a bad decision – and vice versa. To assess the quality of a decision, we must use the information available at the time, filtering out everything learned afterwards. It could be easy to understate the impact of the unforeseen or unforeseeable (“Black Swans” in the jargon, like the 2008 crash – or the unexpected presence of underground services in Edinburgh?).
Cognitive biases reflect the way we are wired. Much of this is unconscious or intuitive, rooted in our evolutionary instincts. We need to find practical and sensible ways to address these biases in major projects. This is where the Hardie Inquiry could be a world leader.
• John Sturrock is chief executive of Core Solutions, www.core-solutions.com