In May 2000, events overwhelmed a fire crew working to burn out an overgrown 300-acre area at Bandelier National Monument in New Mexico. A tiny patch of flame, which kept flaring up every time firefighters thought they had put it out, eventually escaped and grew into the Cerro Grande wildfire, one of the most devastating in our nation’s history and the cause of $1 billion in damages to the city of Los Alamos and the adjacent Los Alamos National Laboratory. Eighteen thousand people were evacuated, and by the time the fire was finally stopped two weeks later, some 47,000 acres had been consumed and 300 homes and laboratory buildings destroyed.1
As with most disasters, many factors contributed to Cerro Grande. But an important one was what we call “dysfunctional momentum,” which occurs when people continue to work toward an original goal without pausing to recalibrate or reexamine their processes, even in the face of cues suggesting they should change course. What happened to the firefighters in New Mexico is not unusual. Dysfunctional momentum arises daily in organizations, sometimes with dreadful results. The members of a project team whose work spiraled out of control look back and wonder: How did we get here? How did we miss the cues that might have signaled huge problems ahead? Or, if they did see the cues: Why didn’t we change course?
Business disasters, like many wildfires, often start out small, with little problems that managers notice but don’t worry about too much. But what happens when small “fires” start to accumulate, grow or change direction? The fact is that when we’re in the middle of the action, we often get so engrossed in what we’re doing that we don’t notice things have changed, or we ignore signals suggesting we should alter our course. And the next thing we know, we’re faced with a full-fledged calamity.
Company managers can learn a lot about preventing dysfunctional momentum, and ultimately avoiding business disasters, from people whose everyday job is to manage complex and volatile situations. People involved in high-hazard work such as firefighting have to be more vigilant about emerging problems not only because of their responsibilities to the public but also because their lives depend on it.
1. K.E. Weick and K.M. Sutcliffe, “Managing the Unexpected: Resilient Performance in an Age of Uncertainty” (San Francisco: Jossey-Bass, 2007), 2.
2. C. Perrow, “Complex Organizations: A Critical Essay,” 3rd ed. (New York: Random House, 1986).
3. C. Perrow, “Normal Accidents: Living with High-Risk Technologies” (New York: Basic Books, 1984).
4. S. Fiske and S. Taylor, “Social Cognition: From Brains to Culture,” 2nd ed. (New York: McGraw-Hill, 1991).
5. D. Vaughan, “The Challenger Launch Decision: Risky Technology, Culture and Deviance at NASA” (Chicago: University of Chicago Press, 1996), 124, 141, 143, 179.
6. B.A. Turner, “The Organizational and Interorganizational Development of Disasters,” Administrative Science Quarterly 21, no. 3 (September 1976): 378-397.
7. K.E. Weick, “The Collapse of Sensemaking in Organizations: The Mann Gulch Disaster,” Administrative Science Quarterly 38, no. 4 (December 1993): 628-652.
8. A. Edmondson, “Psychological Safety and Learning Behavior in Work Teams,” Administrative Science Quarterly 44, no. 2 (June 1999): 350-383.
9. J. Dewey, “Human Nature and Conduct,” 2nd ed. (Mineola, New York: Dover, 2002), 178.
10. Weick and Sutcliffe, “Managing the Unexpected,” 161.
11. R.L. Daft and R.H. Lengel, “Information Richness: A New Approach to Managerial Behavior and Organization Design,” in “Research in Organizational Behavior,” ed. B.M. Staw and L.L. Cummings (Greenwich, Connecticut: JAI Press, 1984), 191-233.
i. K.M. Eisenhardt, “Building Theories from Case Study Research,” Academy of Management Review 14, no. 4 (October 1989): 532-550.