The views expressed by contributors are their own and not the view of The Hill

How to stop cognitive bias from affecting our decisions


On Jan. 27, 1986, engineers at Morton Thiokol contacted NASA to advise calling off the launch of the Space Shuttle Challenger the next morning. The forecast called for freezing temperatures between 26 and 29°F. Concerned that the O-rings on the booster rockets would fail to seal, possibly causing an explosion, the engineers scrambled to explain their first no-launch recommendation in 12 years. They faxed NASA 13 charts identifying launches where the O-rings had shown signs of failure.

NASA carefully inspected the charts and saw no relationship between temperature and failure. NASA told the engineers to reconsider their recommendation. We all know the deadly consequence of the ultimate decision to launch the next morning. What led such accomplished professionals to such different conclusions from the same data?

Confirmation bias. Throughout history, when business executives, politicians, and intelligence analysts allow confirmation bias to inform their decisions, the consequences can be staggering and even deadly. From feeding political partisanship to the Space Shuttle Challenger disaster to the 200,000 deaths in Iraq and $2.4 trillion spent on war since 2001, confirmation bias has impacted our nation in settings from the situation room to mission control to our dinner tables.

Until recently, most of the research said that learning about bias does little to rid us of it — that nudges that change the way decisions are made, or new incentives, are the only way to reduce bias in our professional and personal lives.

New evidence suggests this postmortem was premature. Training can improve our decision making and reduce confirmation bias.

How can we achieve this?

Debiasing ourselves begins by understanding our biases and their effect on our decisions. Confirmation bias, for instance, is a tendency to preferentially test, search for, and interpret evidence supporting existing beliefs, hypotheses, and opinions.

Next, it’s useful to understand how it affected consequential decisions in the past.

Morton Thiokol engineers showed confirmation bias in the evidence they presented to NASA: their charts included only the small number of previous launches with O-ring failures.

NASA engineers showed confirmation bias because they didn’t look at all of the launches — successes and failures — when making their decision. If they had, they would have seen that O-ring failures frequently occurred at low temperatures. And the forecast temperature for Jan. 28 was 26°F, much colder than the lowest temperature of any previous launch (53°F).

Beyond examining our own and others’ decisions to understand confirmation bias, we can apply strategies to reduce its impact on future decisions. One strategy is to include a search for evidence that could disconfirm the theory you are testing. Be your own devil’s advocate. Another is to ask a balance of questions that, if confirmed, would provide evidence both for and against your theory. A third is to take the factor that you think causes an effect out of the equation and see if you get the same outcome. If the outcome repeats without that factor, your theory was wrong.

How effective is debiasing training? I conducted a recent field study with Professors Anne Laure Sellier of HEC Paris and Irene Scopelliti of Cass Business School. Participants were given a one-shot training, which substantially reduced confirmation bias in Carter Racing, a business simulation modeled on the decision to launch the Space Shuttle Challenger.

Participants who received the training were 29 percent less likely to make the equivalent “launch” decision than participants who had not yet gone through the training. Analysis of their written explanations revealed that the training reduced the number of arguments they generated in support of the “launch,” which improved their decision making.

The training intervention was one of two serious games I developed with a team of researchers for the Office of the Director of National Intelligence to reduce six cognitive biases in U.S. government intelligence analysts. Game players are taught about each of the six biases at an abstract level and then learn to map each bias to different problems and domains through game play, narrative examples, and practice problems. In laboratory studies, we have found the games reduce commission of the biases for as long as three months after training.

The Space Shuttle Challenger disaster and the Iraq War are obviously extreme examples, but reducing confirmation bias through training could benefit business executives, managers, politicians, and organizational leaders in many important and everyday decisions.

As managers, we create confirmation bias in our organizations when we direct ourselves or our teams to examine only the potential upside of a new idea, product, or investment, rather than also asking why it might fail. We are then using data to support rather than make a decision. Confirmation bias also influences decision making at a personal level. In hiring, we often have a biased idea of who would be good for a job, and miss out on candidates who would bring fresh ideas and perspectives to the business. Or we have a preconceived idea of who will perform well and overlook the breaks we give favored employees to make sure they do.

Debiasing research is a promising, growing literature. We are still learning much about how and when people can be debiased. But teaching ourselves about bias could bring tremendous benefits to our professional and personal lives.

Carey Morewedge is a professor of marketing at Boston University’s Questrom School of Business. Follow him on Twitter @morewedge