By Robert E. Mittelstaedt, Jr. (Wharton School Publishing)


From the Introduction:


What do the failure of Enron, the Watergate scandal, the nuclear accident at Three Mile Island, and most airline crashes have in common? Quite simply, it would be almost impossible for any of these events to happen without a serious sequence of errors that goes unchecked. Whether it is a physical disaster, a political blunder, a corporate misstep, or a strategic mistake, as the investigation unfolds, we always find that it took a unique set of compounding errors to bring the crisis to front-page status.


In many cases, these blunders are so complex and the impact so serious that we find ourselves saying, “You couldn’t make that happen if you tried.” The difference between organizations that end up on the front page of a national newspaper in a negative light, those you never hear about, and those that end up on the front page in a positive light is the process of Managing Multiple Mistakes (M3).


It has long been known that most man-made physical disasters are the result of a series of mistakes. In most cases, if one can find a way to “break the chain,” a major catastrophe can be avoided. This recognition of failure chains in operating aircraft, trains, nuclear power plants, chemical plants, and other mechanical devices has led to an emphasis on understanding causes and developing procedures, training, and safety systems to reduce the incidence of accidents and to mitigate damage if one does occur. Strangely, there has been little emphasis on extending this process to help avoid business disasters—whether operational or strategic.


Enron, WorldCom, and HealthSouth are now widely known as major business disasters. Enron might even be classified as a major economic disaster, given the number of employees, pensions, and shareholders affected at both Enron and its accountants, Arthur Andersen. As the investigations unfolded, we learned that none of these was the result of a single bad decision or action. Each involved a complicated web of mistakes that were unnoticed, dismissed as unimportant, judged as minor, or purposely ignored in favor of a high-risk, high-payoff gamble.

This book is about the avoidable traps that we, as businesspeople, set for ourselves and that lead to disasters. It is about what we can learn from the patterns of action or inaction that preceded disasters (sometimes called “accidents”) in a variety of business and nonbusiness settings, in order to avoid similar traps and patterns of mistakes. This goes beyond kaizen and Six Sigma on the factory floor to M3 in the executive suite and at all operational levels of companies.


This is not a book about crisis management. It is not about managing public relations, the victims, the lawyers, or the shareholders. It is about discipline, culture, and learning from the experiences of others to improve the odds that you can avoid the things we label as accidents, disasters, or crises altogether. Even if you do not totally avoid such situations, knowledge of the typical patterns that occur should help you create an organization that is observant enough to intervene early and minimize damage. Learning and implementing the lessons described here will not mean that you throw away your plans for handling problem situations. But it could mean that you will never have to manage the aftermath of an unpleasant situation.


There are lessons to be learned from looking at the mistake patterns and commonalities in other organizations, especially since most organizations do not do a very good job of evaluating their own mistakes, even though they have the most information about them. We miss learning opportunities by not being curious enough to look deeply at our own failures, but we also miss a very rich set of opportunities when we do not look at the mistakes others have made, especially when they have been well documented. We often miss these opportunities to learn from others because we believe, “Their situation was different—we don’t have much to learn from them.”


The reality is very different: studies show that while the specifics may differ across industries and situations, the patterns of mistakes preceding accidents are quite similar. Learning doesn’t always come from the sources you expect, like your own experience, your own industry, or very similar companies. It takes a bit of extra effort, but you can often learn more by looking at examples in an industry or situation that is markedly different from your own and recognizing that there are great similarities in the patterns of actions and behaviors. This is because, without the burden of a set of assumptions about what you “know” is the right or wrong way to do something, it is easy to observe the salient facts, absent all the distracting details, and quickly ask yourself something like:


  • Didn’t they know water would boil if they lowered the pressure? (Three Mile Island)
  • Why did they fail to follow the procedure and fly into the ground? (Korean Air)
  • Didn’t they know customers would want a replacement for a defective chip? (Intel)
  • Don’t they know that customers are often more loyal if you admit a mistake and fix it? (Firestone)
  • Didn’t they know the leverage and/or fraud might kill the company? (Enron, WorldCom, HealthSouth)
  • Didn’t NASA learn anything the first time? (Columbia)
  • Why is J&J legendary for its handling of the Tylenol crisis over 20 years ago?
  • How did a United Airlines crew minimize loss of life with a crash landing where “everything” went wrong? (UA-232 at Sioux City, Iowa)


In each case of a crisis with an adverse outcome, there is a very common pattern:


  • An initial problem, often minor in isolation, that goes uncorrected
  • A subsequent problem that compounds the effect of the initial problem
  • An inept corrective effort
  • Disbelief at the accelerating seriousness of the situation
  • Generally, an attempt to hide the truth about what is going on while remediation is attempted
  • Sudden recognition that the situation is out of control or “in extremis”
  • Finally, the ultimate disaster, involving significant loss of life, financial resources, or both, and then the recriminations


We will explore a number of famous and not-so-famous disasters or near disasters from the perspective of the mistake sequence, looking at where it might have been broken to change the outcome or where it actually was broken to minimize the damage. We will call your attention to the mistakes so that you might think about the signals that were present and how, in an ideal world, you might have acted differently.


The mistakes identified are usually the result of direct action or inaction by humans. In many scenarios, the mistake sequence was initiated by equipment malfunctions that were known but not taken into account in decision-making. In other situations, the mistakes may have been in the design of systems or business procedures that were based on faulty assumptions. Sometimes there were significant, uncontrollable initiating or contributing factors, such as equipment failure, a natural weather occurrence, or some other “act of God.” When present, these initiating factors must be considered in decision-making because, although they are not always human in origin, they are part of the chain of causes that leads to disaster, and humans have an opportunity to intervene in that chain effectively or ineffectively.


In the past, you may have looked at the occurrence of disasters or recovery from near-disasters as a matter of passing interest in the news. We are suggesting that you look a little deeper, learn a little more, and stretch a little further for the implications that you can use:


  • Is there a disaster waiting to happen in my organization?
  • Will we see the signs?
  • Will we stop it soon enough?
  • Do we have the skills to see the signals and the culture to “break the chain”?
  • Are we smart enough to realize that it makes economic sense to care about reducing or stopping mistakes?


Learn from the mistakes of others and envision business success without mistakes, because your future may depend on your ability to do just that. To aid in this quest, we will identify some “Insights” linking common themes that emerge from the study of mistakes across industries and situations. These will appear where appropriate in each chapter and will be summarized more broadly in Chapter 10, “Making M3 Part of Your Culture for Success.”