How to Avoid Business Disasters with Behavioral Science

Data breaches, customer service embarrassments and other stock-tanking missteps seem to be in the news every other day—but it doesn't have to be this way

The Cambridge Analytica data-breach scandal at Facebook led to a massive fall in the company’s stock price, prompted governments around the world to discuss new regulations, inspired commenters on both the left and the right to call for breaking Facebook up, triggered multiple lawsuits and drove many Facebook users to leave the site. After three recent incidents involving pets on United Airlines, the company’s favorability rating fell, and customers who heard about the incidents were significantly more likely to choose another airline. Equifax is in the news again, this time for discovering an additional 2.4 million victims of its data breach.

You might be surprised to learn that all of these disasters were avoidable, as are many of the disasters we suffer in our professional lives that never make the news. Our brains make systematic and predictable errors (what behavioral scientists call cognitive biases) that lead us to make poor decisions. Fortunately, recent research shows that we can improve our decision-making with relatively little effort. As a behavioral science expert, a consultant and speaker on decision-making, and the author of an Amazon bestseller on this topic, The Truth-Seeker’s Handbook: A Science-Based Guide, I want to suggest an effective, research-based approach that anyone can use to avoid disasters in the workplace.

Were These Disasters Really Avoidable?


The CEOs of all three companies recognized that they and their organizations made errors that led to these business disasters. Facebook’s CEO Mark Zuckerberg acknowledged that Facebook made many mistakes, such as giving external apps broad access to users’ data, taking Cambridge Analytica and other companies at their word about what they did with that data and failing to inform users about how their data was being used.

Oscar Munoz, United’s CEO, likewise stated that United got it wrong. He committed not only to changing the way United transports pets but also to putting United crews through a new program focused on “safety, caring, dependability and efficiency.” Equifax’s former CEO Richard Smith, who was forced to retire after the data breach, apologized for Equifax’s failure to follow its own policies and procedures. Apparently, after the Department of Homeland Security warned Equifax of a vulnerability, the company failed to follow its own process and patch the security flaw, enabling hackers to access the data of over 140 million customers.

Why Do We Suffer Avoidable Disasters in the Workplace?

Avoidable professional disasters frequently stem from three major cognitive biases: the overconfidence effect, optimism bias and the planning fallacy.

When asked whether they are more, less or equally skilled than the average driver, 93 percent of Americans rate themselves as more skilled. And when study subjects said they were 100 percent confident in their answers, they were wrong 20 percent of the time. No wonder researchers have found that the overconfidence effect, our tendency to be excessively confident in our decision-making, harms performance in the workplace, among CEOs and ordinary professionals alike. Indeed, Zuckerberg has described how he was wrong to be confident that Facebook users would prefer to make their data more public in exchange for more opportunities for social engagement.

Optimism bias is our tendency to be excessively optimistic about the future. Studies show, for example, that we underestimate our risk of suffering negative events and overestimate the likelihood of positive ones. We fall into optimism bias frequently in the workplace, overstating the benefits of projects and understating their costs. A related bias, the planning fallacy, is our tendency to assume our plans will go perfectly and so fail to build in enough resources for potential problems. As a result, endeavors tend to run over budget and past deadline. Both United and Equifax were excessively optimistic about their existing processes, the first in handling pets and the second in fixing the security flaw after it was discovered.

Addressing Avoidable Business Disasters with Behavioral Science

Facebook, United and Equifax have been evaluating what went wrong and how they should fix it, an approach called a postmortem. To prevent these disasters, they could have used a technique known as a premortem. This approach, which has been shown to address cognitive biases that lead to business disasters, involves evaluating in advance what can go wrong with a project or process and coming up with solutions to foreseeable problems.

To conduct a premortem, first gather a team of relevant stakeholders, consisting of a mix of people with decision-making authority and expertise in the matter under evaluation. If you are doing this by yourself, ask a couple of fellow professionals or friends who know you well to help you out.

Then ask everyone to imagine that the project or process has definitely failed and to write down, anonymously, some plausible reasons why, ranging from internal problems within the organization to external events. Encourage participants to focus especially on reasons they would not normally raise because doing so would seem rude or impolitic, such as questioning a colleague’s competence, or even dangerous to their careers, such as criticizing the organization’s strategy.

Next, have the facilitator gather the statements and draw out common themes, especially ones that might not otherwise be raised. Participants then discuss the potential reasons for failure and assess, once again anonymously, the likelihood of each. Finally, the stakeholders brainstorm solutions to the most likely failures and map out next steps for implementing those solutions, revising the project or process as needed.

Premortems conducted regularly on existing processes could have caught the kinds of issues that led to disasters for United and Equifax. They can also be conducted in response to warning signs, as in 2015, when Facebook found out that Cambridge Analytica had obtained user data improperly yet failed to take meaningful steps to address the problem, despite having been warned by regulators as early as 2011. Any professional can run premortems regularly on existing processes and before launching new projects to avoid such disasters.