We’ve all heard the parable of the frog that is placed in a pot of water and then boils to death as the water is gradually heated. Though the parable has since been debunked, it remains a rich metaphor for how our finest organizations and executives sometimes get themselves into trouble.1

To extend the metaphor, many of our most popular business articles and books have focused, in one way or another, on how to avoid getting boiled to death. Levitt’s “Marketing Myopia” is a classic in this regard.2 How could buggy whip manufacturers be so shortsighted as not to see that the age of the automobile would wipe out their market? Why did they stubbornly refuse to think more broadly about the market they served? More recently, Clayton Christensen has advanced the idea that “disruptive technologies” can catch successful companies off guard, causing them to fail.3

Overcommitment to a course of action (even one that has worked in the past and may still be working) is likely to make executives miss certain warning signs or misinterpret them when they do appear. Returning to the frog parable, executives are apt to ignore the fact that the temperature around them is gradually rising. Like real frogs, however, executives can escape from their pot of hot water. The problem is that, in many cases, executives become so strongly wedded to a particular project, technology, or process that they continue when they should pull out. Instead of terminating or redirecting the failing endeavor, managers frequently pour in more resources.4 Management scholars call this escalation of commitment to a failing course of action.5 While escalation of commitment is a general phenomenon, it is particularly common in technologically sophisticated projects with a strong information technology (IT) component.
The special nature of these projects, including the high complexity, risk, and uncertainty they involve, makes them particularly susceptible to escalation. Consider the following example. In 1992, California’s Department of Social Services (DSS) began a project to develop the Statewide Automated Child Support System (SACSS). The development work was contracted out to Lockheed Martin under a $75.5 million contract and was scheduled to take three years to complete. But by 1995, the estimated cost of the still-unfinished project had grown to $260 million.
1. “Next Time, What Say We Boil a Consultant?” Fast Company, November 1995, p. 20.
2. T. Levitt, “Marketing Myopia,” Harvard Business Review, volume 53, September/October 1975, pp. 26–28, 33–39, 44, 173–181.
3. C.M. Christensen, The Innovator’s Dilemma (Boston: Harvard Business School Press, 1997).
4. B.M. Staw and J. Ross, “Knowing When to Pull the Plug,” Harvard Business Review, volume 65, March–April 1987, pp. 68–74.
5. See, for example:
J. Brockner, “The Escalation of Commitment to a Failing Course of Action: Toward Theoretical Progress,” Academy of Management Review, volume 17, January 1992, pp. 39–61.
6. T. Newcombe, “Big Project Woes Halt Child Support System,” Government Technology, February 1998, pp. 34–35.
7. For an overview of the software project failure problem, see:
W.W. Gibbs, “Software’s Chronic Crisis,” Scientific American, volume 271, September 1994, pp. 86–95.
The following articles provide examples of software project escalation:
G.H. Anthes, “IRS Project Failures Cost Taxpayers $50B Annually,” Computerworld, 14 October 1996, p. 1;
M. Betts, “Feds Debate Handling of Failing IS Projects,” Computerworld, 2 November 1992, p. 103;
V. Ellis, “Audit Says DMV Ignored Warning,” Los Angeles Times, 18 August 1994, pp. A3, A24;
J. Johnson, “Chaos: The Dollar Drain of IT Project Failures,” Application Development Trends, volume 2, January 1995, pp. 41–47;
S. Kindel, “The Computer That Ate the Company,” Financial World, volume 161, 31 March 1992, pp. 96–98;
J. Rothfeder, “It’s Late, Costly, Incompetent—But Try Firing a Computer System,” Business Week, 7 November 1988, pp. 164–165;
R. Tomsho, “Real Dog: How Greyhound Lines Re-Engineered Itself Right Into a Deep Hole,” Wall Street Journal, 20 October 1994, pp. A1–A6; and
Newcombe (1998).
8. M. Keil and J. Mann, “The Nature and Extent of Information Technology Project Escalation: Results from a Survey of IS Audit and Control Professionals,” IS Audit & Control Journal, volume 1, 1997, pp. 40–48.
9. De-escalation can be said to have occurred whenever there is reduced commitment to a previously chosen course of action. In the case of project failure, such a reduction in commitment could manifest itself as project abandonment, but it could also be manifested in the form of significant changes made in relation to some previous course of action. Thus, we take a broad view of de-escalation in this paper, defining it so as to include abandonment as well as project redirection.
10. For references to prior studies, see:
M. Keil and D. Robey, “Turning Around Troubled Software Projects: An Exploratory Study of the Deescalation of Commitment to Failing Courses of Action,” Journal of Management Information Systems, volume 15, Spring 1999, pp. 63–87.
11. This research was conducted through a combination of surveys, in-depth case studies, and an examination of published accounts of IT project escalation. See:
Keil and Mann (1997);
Keil and Robey (1999);
M. Keil, “Pulling the Plug: Software Project Management and the Problem of Project Escalation,” MIS Quarterly, volume 19, December 1995, pp. 421–447;
R. Montealegre and M. Keil, “Denver International Airport’s Automated Baggage Handling System: A Case Study of De-escalation of Commitment,” Academy of Management Best Papers Proceedings 1998, pp. D1–D9 (www.aom.pace.edu);
R. Montealegre and M. Keil, “De-Escalating Information Technology Projects: Lessons from the Denver International Airport,” MIS Quarterly, forthcoming; and
H. Drummond, Escalation in Decision-Making: The Tragedy of Taurus (New York: Oxford University Press, 1996).
12. The results of this study have been reported elsewhere and will only be summarized here. See:
Montealegre and Keil (1998); and
Montealegre and Keil (forthcoming).
For additional information on the Denver International Airport case, see:
R. Montealegre, C. Knoop, J. Nelson, and L.M. Applegate, “BAE Automated Systems (A): Implementing the Denver International Airport Baggage-Handling System” (Boston: Harvard Business School Case 9-396-311, 1996); and
R. Montealegre, C. Knoop, J. Nelson, and L.M. Applegate, “BAE Automated Systems (B): Implementing the Denver International Airport Baggage-Handling System” (Boston: Harvard Business School Case 9-396-312, 1996).
13. J. Bouton, “State-of-the-Art Baggage System for Denver,” Airport Forum, February 1993, pp. 10–13.
14. P. O’Driscoll, “ ‘Low Tech’ Salvation: Webb Orders Backup Bag System,” Denver Post, 5 August 1994, p. A-1.
15. Under normal circumstances, two to three weeks would often elapse between the buying and selling of shares and the actual exchange of money and stock certificates, making the market both riskier and less efficient than desirable. When the 1987 stock market crash occurred, there was a settlements crisis resulting in nearly one million unsettled share transactions. Trades remained unsettled for months.
16. H. Drummond, “ ‘It Looked Marvellous in the Prospectus’: TAURUS, Information and Decision Making,” Journal of General Management, volume 23, Spring 1998, pp. 73–87.
17. It was projected that Taurus would save the equities industry $225 million over 10 years. The Taurus project has been well documented by H. Drummond, and we draw heavily upon her work in the analysis presented here. See:
H. Drummond, Escalation in Decision-Making: The Tragedy of Taurus (New York: Oxford University Press, 1996).
18. Ibid., p. 64.
19. Ibid., p. 74.
20. Vista refused to enter into a fixed-price contract for the modifications.
21. Drummond (1996), p. 60.
22. Drummond (1998), p. 76.
23. H. Drummond, “Are We Any Closer to the End? Escalation and the Case of Taurus,” International Journal of Project Management, volume 17, February 1999, pp. 11–16.
24. Drummond (1996), p. 138.
25. Ibid., p. 137.
26. Ibid., p. 141.
27. Ibid., p. 138.
28. Ibid., p. 139.
29. Ibid., pp. 147–148.
30. Ibid., p. 69.
31. Ibid., p. 70.
32. Ibid., p. 114.
33. Ibid., p. 148.
35. Ibid., p. 150.
36. Ibid., p. 151.
38. Ibid., p. 155.
39. Ibid., p. 152.
40. Ibid., p. 151.
41. Staw and Ross (1987).
42. This finding is consistent with previous literature. See, for example:
J.Z. Rubin and J. Brockner, “Factors Affecting Entrapment in Waiting Situations: The Rosencrantz and Guildenstern Effect,” Journal of Personality and Social Psychology, volume 31, June 1975, pp. 1054–1063;
J. Brockner, M.C. Shaw, and J.Z. Rubin, “Factors Affecting Withdrawal from an Escalating Conflict: Quitting Before It’s Too Late,” Journal of Experimental Social Psychology, volume 15, September 1979, pp. 492–503;
B.E. McCain, “Continuing Investment Under Conditions of Failure: A Laboratory Study of the Limits to Escalation,” Journal of Applied Psychology, volume 71, May 1986, pp. 280–284;
G.B. Northcraft and M.A. Neale, “Opportunity Costs and the Framing of Resource Allocation Decisions,” Organizational Behavior and Human Decision Processes, volume 37, June 1986, pp. 348–356; and
M. Keil, R. Mixon, T. Saarinen, and V. Tuunainen, “Understanding Runaway Information Technology Projects: Results from an International Research Program Based on Escalation Theory,” Journal of Management Information Systems, volume 11, Winter 1995, pp. 67–87.
43. R. Arnheim, Art and Visual Perception: The Psychology of the Creative Eye (Berkeley, California: University of California Press, 1954); and
R. Weisberg, Creativity: Genius and Other Myths (New York: W.H. Freeman, 1986).
44. D.J. Couger, Creativity & Innovation in Information Systems Organizations (Danvers, Massachusetts: Boyd & Fraser, 1996).
45. B.M. Staw and J. Ross, “Behavior in Escalation Situations: Antecedents, Prototypes, and Solutions,” in Research in Organizational Behavior, volume 9, B.M. Staw and L.L. Cummings, eds. (Greenwich, Connecticut: JAI Press, 1987), p. 55.
46. One means of saving face in the midst of such a crisis is to exercise various impression management techniques. See, for example:
C.L. Iacovou and A.S. Dexter, “Explanations Offered by IS Managers to Rationalize Project Failures” (Vancouver: University of British Columbia, Management Information Systems Division, Working Paper 96-MIS-003, August 1996); and
M.L. Leatherwood and E.J. Conlon, “Diffusibility of Blame: Effects on Persistence in a Project,” Academy of Management Journal, volume 30, December 1987, pp. 836–847.
47. Drummond (1996), p. 151.
48. J. Ross and B.M. Staw, “Organizational Escalation and Exit: Lessons from the Shoreham Nuclear Power Plant,” Academy of Management Journal, volume 36, August 1993, pp. 701–732.
We are indebted to the employees of the city of Denver, BAE, other contractors who worked on the Denver International Airport, and the airlines for their willingness to discuss the circumstances surrounding the computerized baggage-handling system. We would also like to thank Helga Drummond for providing such a rich case study of the Taurus system. It is rare to find examples of IT failures that have been documented so thoroughly. Finally, we gratefully acknowledge the support of the Robinson College of Business at Georgia State University and the College of Business at the University of Colorado for providing summer research grants to pursue this project.