
The 1986 ‘Challenger’ Disaster Informs the COVID-19 Pandemic: How to Make Bad and Good Decisions about Risk


The Challenger space shuttle exploded shortly after lift-off, killing seven people. The disaster took place decades ago and was not caused by a virus. Nonetheless, it holds lessons for making the best possible decisions as we open and close schools and businesses during the pandemic.

Analogies are always imperfect, but they can help us know what to consider before making big decisions. As you read the following summary of the Challenger tragedy, please consider the too-risky choices that led to the deaths of those seven astronauts and the lessons those flawed judgements offer. Above all, apply those lessons to your future decisions about COVID-19.

Countless decisions everywhere, good ones required

The October 2019 Global Health Security Index ranked the US and the UK as the two countries best prepared for a pandemic among all 195 assessed. By June 2020, those countries were again #1 and #2, but this time the criterion was ‘excess deaths’ compared with the number expected in the absence of a pandemic.

Terrible decisions are causing these horrifying results. As the pandemic continues to challenge us, we need as many people as possible to make as many good and as few bad choices as possible about:

  • How to protect ourselves and others from infection (wearing masks, washing hands, and distancing);
  • Whose lead to follow (governors, mayors, Trump administration, public health experts, friends, doctors); and
  • Whether, when, and how to open and close our schools and businesses.

With countless choices to make, most of us will make some good decisions and some bad ones. Making those decisions thoughtfully, as opposed to instinctively, is key to making as few mistakes as possible.

The Challenger disaster

The essential lessons for decision-making from the Challenger disaster are psychological and behavioural. At Morton Thiokol, the contractor responsible for the shuttle’s solid rocket boosters, some engineers and managers had worried for years about O-ring defects. Early on, engineers discovered evidence of flaws in the rubber gaskets that sealed the joints of the rocket’s boosters. Still, the problem didn’t seem too severe to decision-makers, who deemed the risk acceptable.

Eventually, as O-ring problems appeared in seven of the nine 1985 shuttle launches, engineers and some managers came to believe strongly that O-ring failures could prove deadly. Further evidence convinced them that the most severe danger came in the coldest temperatures.

Evidence kept mounting, and concerns kept growing. Engineers began a redesign process, but NASA officials did not heed their warnings.

On the day of the Challenger launch, the temperature was below freezing, far lower than at any previous launch. Ice all over the launching pad caused serious concern. An ice team worked all night to remove it, and the mission manager in Houston postponed the launch by an hour. The added time would allow the temperature to rise a bit, and the ice team could inspect the Challenger again and clear it for launch.

The Morton Thiokol engineers were terrified, and their managers recommended that NASA – the customer – not launch. But NASA managers were under time pressure. They argued against cancellation, violated several mission rules, and decided to launch. Seventy-three seconds after lift-off, the Challenger broke apart.

Lessons for our pandemic choices

Every analogy is imperfect, but the Challenger offers vital lessons for making pandemic-related decisions:

  • Recognise your all-too-human biases

When confronting risk and uncertainty, we rarely make decisions that are as good as we think. For instance, the danger repeatedly described by Thiokol engineers and their managers seemed more acceptable to NASA decision-makers than the risk of failing to meet the schedule.

I must acknowledge the advantage of 20/20 hindsight. The point is not about the NASA decision-makers so much as the tendencies we all fall prey to. We often care more about speed than quality, which leads us to make fast decisions rather than good ones. A prime COVID-19 example was reopening non-essential businesses prematurely.

  • Rely not on instinct but good thinking

We make many decisions based on automatic, thoughtless, fast-but-biased System 1 information processing. In contrast, System 2 processing takes more time and effort, is deliberative and thoughtful, and is likely to result in more rational choices, offering the greatest net benefit or least harm.

Instinct can work, but mainly if you have deep expertise and familiarity with the problem you face. More often than not, relying on instinct without slowing down to seek information and think leads to less-than-optimal outcomes; in the case of COVID-19, unnecessary illness and avoidable deaths.

  • Consider the consequences of your instinctive choice and its opposite

Rationality requires thinking beyond your natural preference – for instance, attending a fun event – and considering at least one alternative (not going). It also requires thinking through the pros and cons of both options.

Crucially, some costs and benefits are immediate while others are long-term, and the differences can be immense. All else equal, immediate consequences influence us more than what might come later, creating a conflict between what we want to do (now) and what we should do (for the best eventual outcome). We give in to temptation (overeating, overdrinking, skipping a workout or a class) because we enjoy the activity in the moment, but we regret the consequences later. This ‘want’ versus ‘should’ conflict helps explain many unsafe pandemic choices.

  • Consider ethical implications

Assessing benefits and risks can be a purely selfish calculus but does not need to be limited to concern for oneself. Ethics is a personal choice about whether and how to weigh the effects of your actions on others.

In addition to the short-term pleasure of choosing ‘want’ over ‘should’, a person’s values can explain their willingness to take seemingly ill-advised risks. Defying safety recommendations sometimes stems from valuing individual freedom and not wanting others to tell us what to do.

We all value our freedoms; we all value our lives. What makes decisions with ethical implications difficult is that they require prioritising one value over others. With pandemic decisions, free choice takes priority for some people, while a desire not to harm family, friends, and even strangers takes precedence for others.

  • Beat the false choice with third options

Even when feeling caught between a rock and a hard place, you usually can find more than two options. In the heat of the moment, the Challenger options seemed to be to launch or to cancel. But a viable and safe third option – possibly not even considered after the first delay – was to wait for a warmer day.

Sadly, some officials view the pandemic as a string of zero-sum, economy-versus-health decisions. But deciding when to go back to work – for instance, when your boss demands your return, and you need the money but are wary – might include additional options, such as allying with co-workers to initiate safer practices and negotiating best paths forward with the boss.

Framing a false choice of health versus economy has already caused big mistakes and will cause more; the two options are neither separate nor either/or. Helping the economy was an intended benefit of reopening businesses, but early reopening hurt public health, which in turn made people wary of returning to stores and restaurants. We must solve the pandemic to bring back long-term prosperity; opening too soon, hiding pandemic facts, fudging the data, and conveying false optimism ignore the real dangers of the pandemic and short-circuit the economic recovery.

  • Know when to change course

NASA had put a lot of resources (sunk costs) into the launch, making it difficult psychologically to cancel. Public officials who reopened too early made mistakes, but some heeded the worsening virus data and reversed course sooner rather than later.

  • Beware of overconfidence

More often than not, confidence is beneficial. But we all know that overconfidence and cockiness can cause bad decisions; not every time, but often enough to suggest that you stop and reconsider before taking a dangerous plunge.

Extreme overconfidence is a feature of narcissism. Most decision-makers are not narcissists, but all should be wary of taking advice from one.

  • Don’t let hope block action

The Challenger decision-makers hoped the O-ring problem would go away. They hoped the engineers were wrong and the redesign efforts would work. They hoped the ambient temperature might be high enough. We all have analogous hopes about the virus – including some public officials’ fantasies, stated and repeated with an apparent confidence that some listeners buy.

Hope is not a plan, a decision, an action, or a solution. Effective action is the ultimate superpower.

  • Weigh data properly

In the Challenger story, the decision-makers gave heavy weight to the recent string of accident-free launches and to the morning’s temperature increase made possible by the postponement. The warmer temperature and previous launch successes were top of mind: memorable, salient, and useful for driving and defending the launch decision.

Put another way, we cherry-pick data, which helps us rationalise our decisions with inadequate, often psychological, defences.

  • Avoid the ‘not invented here’ (NIH) bias

We value our own opinions and ideas more than those of other people. The NASA decision-makers did not accept urgent, informed advice from Thiokol engineers and managers. NIH is an acronym worth remembering, so you don’t reject the best ideas simply because they weren’t your own or don’t fit with your instinctive biases.

That’s not a personal insult. NIH is a common human bias, like the others above. With knowledge and effort, we can override such preferences, make better choices, and obtain better outcomes.

  • While progressing, suppress the backslide

We made progress against COVID-19 and then backslid. We stopped being vigilant and, when we didn’t immediately get sick ourselves, took more risks. As the saying goes: two steps forward, one step back.

Moral licensing occurs when we give ourselves credit for doing the right thing and then slacken. This will hurt us badly in our battle against the virus. Without vigilance and a dose of grit, progress begets backsliding.

Final thoughts

The Challenger decision-makers did not adequately weigh the dangers. Making a choice that doesn’t work out, after thinking long and hard in risky, uncertain times, is understandable. So is making a mistake on a close call without helpful guidance from public officials. But not taking a pandemic seriously is inexcusable.

Read The Challenger Launch Decision by Diane Vaughan for a more thorough and authoritative account of the errors and biases that led to that disaster. For updates on the pandemic and the situation where you live, read and listen to the best informed and most thoughtful government executives, health experts, and community leaders.

Biases affect decision-makers and citizens everywhere. Make the most rational, safest choices for yourself, your friends and family, and people you don’t even know. Don’t forget the Challenger. We are all decision-makers.

***

An earlier version of this appeared in Psychology Today.

***

Image credit: Pixabay


Thomas Bateman is Professor Emeritus with the McIntire School of Commerce, University of Virginia.

