
Bad decisions put to the test on Mount Everest

©Howling Red/Unsplash. Sunset on Mount Everest.

In May 1996, two rival mountaineering companies embarked upon a new ascent of Mount Everest. The expeditions were led respectively by the charismatic Rob Hall and Scott Fischer, who aimed to take their new clients to the top of the world. On the 10th of May, the two teams were descending from the summit when they were caught out by a violent storm. The leaders of both groups of climbers, as well as two of their clients and a guide, died that day, unable to reach the nearest camp.

Climbing Everest is no trivial matter. Since the first ascent in 1953, some 280 people are believed to have met their end there. Yet it is still not the world’s most dangerous mountain: Annapurna I, K2 and Nanga Parbat all have fatality rates of between 20% and 28%. Compared with these deadly rivals, Everest is the most climbed of the great peaks and arguably one of the safest. So why did experienced, well-equipped professionals (who had, among other things, bottled oxygen) end up making a series of bad decisions that ultimately condemned them to death?

Cognitive biases, or why our choices are not always logical

In the early 1970s, two big names from the world of psychology, Amos Tversky and Daniel Kahneman, sparked a revolution by showing that humans are not rational creatures and that our choices are not always logical. In other words, many of our judgements, choices and decisions are the result of largely arbitrary rules of thumb rather than of statistical and logical calculation. This is mainly due to the existence of “cognitive biases”: subconscious mental mechanisms that can influence or distort our relationship to reality, shaping our opinions and behaviours without our noticing.

According to Kahneman, the source of these biases lies in the complex interplay of two systems in our brain, which he labels “System 1” and “System 2”. He argues that in System 1 the brain forms thoughts in a fast, automatic way. It makes decisions spontaneously and instinctively, often guided by emotions. In System 2, thoughts are formed in a slower, more deliberative way. The brain makes decisions consciously and rationally.

The downside of System 2 is that it assumes you have the time and desire to engage in deep reflection. For practical reasons, most of our day-to-day decisions are therefore made by System 1; if all our decisions were based on a drawn-out thought process, it would take days and we would have no time for anything else.

Most of the time, the shortcuts that our brain creates via System 1 are both useful and sensible. They allow us to quickly sort and interpret the information we receive, so that we can take action. Sometimes, however, things can go wrong. The shortcuts taken by our fast brain (System 1) can at times prove harmful. In such instances, we talk about “cognitive biases” – the name given to the many shortcuts in thinking that, rather than taking us safely and efficiently to the right destination, instead lead us into trouble.

Cognitive biases tested on Mount Everest

In 2002, six years after the incident on Everest, Michael Roberto, then an assistant professor at Harvard Business School, investigated the accident that cost the lives of some of those on the expedition. After analysing the accounts of the survivors and questioning them carefully on the sequence of events leading up to the incident, he established that at least three key cognitive biases could have led to bad decisions being made during the ascent, contributing to the catastrophic outcome.

The “sunk cost fallacy” entails sticking with a decision that you know is a bad one for fear of wasting the sacrifices you have made so far. This could explain why the climbers persisted in their ascent despite a dire weather forecast and in full knowledge of the risks they were running, particularly that of not having enough energy or oxygen left for the descent. In his account of the expedition, the American journalist Jon Krakauer – who had been invited to join the expedition – recalls how difficult it was for certain members of the team to accept that they should turn back when the decision needed to be made. They had sacrificed so much, financially, physically and psychologically, to get to where they were that it seemed unthinkable to give up so close to the goal. In the words of Doug Hansen, one of the client mountaineers who died that day: “I’ve put too much of myself into this mountain to quit now, without giving it everything I’ve got”.

Although Rob Hall, the leader of the expedition, was fully aware of the dangers involved in continuing the ascent and had himself laid down rigid rules designed precisely to guard against moments of weakness, he was unable to stand up to one of his clients. Worse still, he accompanied Doug Hansen on his final ascent, knowing full well that they would not have enough oxygen or strength for the descent. Neither man made it down.


The overconfidence bias involves overestimating our capabilities, skills or the strength of our judgement at the time of making a choice. It could explain the guides’ poor assessments of their own abilities. It is worth remembering that the leaders of the two expeditions were experienced mountaineers who had both climbed Everest several times and successfully taken dozens of people to the summit over the five previous years. Fischer was therefore used to brushing aside the doubts of his clients, reassuring them that everything was under control. He is said to have told a journalist: “I believe 100 percent I’m coming back…. My wife believes 100 percent I’m coming back. She isn’t concerned about me at all when I’m guiding because I’m gonna make all the right choices”. As Michael Roberto noted in his analysis, Fischer was not the only one to have boasted about having complete confidence in himself. Several other members of the expedition had similarly overestimated their physical and psychological capabilities and were in no doubt as to their ability to successfully reach the summit and make it back down.

The “recency bias”, which can be seen as a variant of confirmation bias (only seeing what we want to see), also seems to have played a role in the Everest drama. This bias could be what led the expedition leaders to underestimate the likelihood of a violent storm catching them off-guard, and therefore to fail to prepare for such an eventuality. The weather during their recent expeditions had been good; in fact, it had been particularly mild over the previous five years. They therefore took only these recent observations into account, subconsciously ignoring the older weather data that clearly showed storms to be a frequent occurrence on Everest. Had they widened the scope of their analysis to include this older data, they would have recalled that in the mid-1980s – just ten years earlier – there had been no expeditions up Everest for three consecutive years due to bad weather.

A fourth bias could also have been at play in these events: the “authority bias”. The fact that there was a defined hierarchy among the climbers meant that some of them, particularly the guides, were reluctant to question the decisions of the leaders, even at times of crisis. For example, one of the guides is reported to have said that he did not dare express his disagreement with some of the decisions made by Rob Hall (the expedition leader), preferring to defer to the experience, status and authority of his “superior” rather than voicing his reservations. More generally, Michael Roberto highlights that the team lacked “psychological safety” – a climate in which everyone feels able to speak up and debate differing viewpoints. Had there been a feeling of mutual trust and opportunities to calmly discuss differences of opinion, certain biases might not have undermined the expedition leaders’ judgement at the very moment it was most vulnerable.


Besides demonstrating the disastrous consequences cognitive biases can lead to in high-risk situations, the case of the 1996 expedition also shows how difficult it can be to resist them, even when we are aware of their existence and the risks they pose. The strict rules the mountaineers had set for themselves before making their ascent – including turning back if they had not reached the summit by 2.00 p.m. – were specifically intended to protect them from these biases: they knew all too well that their judgement would be seriously impaired as they approached the summit, as a result of extreme fatigue and lack of oxygen. But instead of abiding by their rules, they let their biases take over and lead them straight to their deaths.

Specialists believe that over 90% of our mental behaviour is subconscious and automatic, and thus that much of what we judge and do is dictated by these processes without our being aware of it. In this context, understanding and recognising the existence of biases can help us to avoid certain traps, particularly when we have to make crucial decisions, whether that means coming back from Everest safe and sound or not ruining your business (or your life).
