What do the Firestone Tire crisis, the Watergate scandal, Three Mile Island, and most airline crashes have in common? Quite simply, it would have been almost impossible for any of these disasters to occur without a serious chain of unchecked errors leading up to the catastrophe. Whether it is a physical disaster, a political blunder or a serious corporate misstep, the ensuing investigation inevitably reveals that a unique set of errors combined and compounded to make the crisis front-page news. The difference between organizations that the media portrays in a negative light, those portrayed in a positive light and those you never read about in the first place has much to do with the process of “managing multiple mistakes (M3).”
In crises that end badly for those involved, a common pattern exists. First, there is an initial problem, often minor in isolation, that goes uncorrected. Second, there is a subsequent problem that compounds the effect of the initial problem, usually in conjunction with an inept attempt at correction. Third, there is disbelief at the accelerating seriousness of the situation. Fourth, an attempt is made to hide the truth about what is going on while efforts at remediation get under way. Fifth, there is a sudden recognition that the situation is out of control, or “in extremis.” And finally, the ultimate disaster scenario plays out, with recriminations and significant loss of life, financial resources, or both.
Each day brings new information about the Firestone fiasco, including new attempts at finger-pointing and more questions about potential multiple causes intended to diffuse responsibility. Firestone is only beginning to fully understand the severity of this crisis and the long-term threat to its business. In contrast, Johnson & Johnson’s successful handling of the Tylenol scare some years ago showed decisive action and a total commitment to safety, regardless of the cost. Admittedly, removing painkillers from the shelves is different from removing tires and finding replacements. The lack of sufficient inventory to handle the huge demand required Bridgestone Firestone to come up with a phased replacement plan to match manufacturing capacity. One of the company’s mistakes was its decision to replace defective tires with other Firestone tires rather than competitors’ tires, because it would cost less. The correct action would have been for Firestone to recall all the tires immediately and announce that, if Firestone replacements were unavailable, the company would fund replacements offered by its competitors. Unfortunately, Firestone came to this as an afterthought, once it was clear how bad the consequences were going to be. It may well be too late.
Physical disasters such as Three Mile Island and most airline crashes follow similar patterns. Little problems are ignored or misdiagnosed. Failure to act early and decisively results in dire consequences. At TMI, every conceivable human error was made while the system tried to protect itself according to design. The final lapse came when plant operators forgot that, at a given temperature, water boils if you lower the pressure.
Unfortunately, in the case of aircraft, by the time a problem presents itself to the pilot, there has most likely been a chain of mistakes that should have been broken elsewhere, as this summer’s tragic crash of the Concorde outside Paris demonstrated. There is often a slight chance of recovery when disaster strikes during or after take-off, but success usually depends on extraordinary skill and luck. Such was the case in the successful dead-stick landing, on a closed airfield, of an Air Canada 767 that ran out of fuel in 1983. In that situation, the pilot was able to land with minimal damage to the aircraft thanks to his skill as a glider pilot and his knowledge of the closed airfield – proficiencies that went well beyond his formal training as a line pilot.
With political mistakes such as Watergate, certain individuals acted in ways that virtually guaranteed that any discovery of the situation would yield increasingly damaging results and require increasingly elaborate acts of deception.
One can speculate that the same pattern of M3 failure will be exposed at Firestone at some point in the future. There are already allegations of manufacturing quality-control problems, failure to consistently meet specifications, disagreements over recommended operating parameters and disregard for early warning signs of failures. All that remains is to find out who knew what, and when, and how much was covered up in an attempt to “minimize damage.”
What, then, can or should you do as a manager, executive or operator of any piece of a complex organization or machine? The first step is to recognize and acknowledge that the probability of serious consequences from a single mistake is quite low. Most of the world’s classic blunders result from an unbroken chain of events – an inability to manage multiple mistakes. That is why Toyota auto plants, for example, equip their production lines with red cords that any worker can pull if problems arise on the line. It is easier to correct a mistake early on than to face the recall of thousands of defective vehicles or parts further into the manufacturing process.
Second, you need to make M3 part of your lexicon. In most cases, the benefit/cost ratio of breaking the mistake chain early is almost infinite. Third, create an atmosphere that allows mistakes to be discovered and corrected in a positive fashion. Fourth, use case studies and examples to educate others about the danger of not managing multiple mistakes. Fifth, make it the personal responsibility of every individual in the organization to identify mistakes and “stop the production line.”
If the steps above sound similar to total quality management, kaizen and other ’90s buzzwords, they are. The big difference is that most of the quality movement has historically focused on narrow definitions of processes and errors that are largely physical in nature. Extending the concept to broad organizational and systems issues is very different from looking for the flaw in an individual widget coming off the production line. What if a series of mistakes in analyzing markets, understanding customers, ignoring feedback and misdirecting investments led you to manufacture the “perfect” widget whose only flaw was that no one wanted to buy it? This is where traditional definitions of quality and M3 are very different. Start counting the mistakes that Firestone made and ask yourself how many there were and when they might have been avoided. Then ask yourself, “Is M3 needed in my organization?”