How to Make Sense of Weak Signals
There’s no sense in denying it: translating weak signals into useful decisions takes time and focus. These three stages can help you see the periphery—and act on it—much more clearly.
“When people stumble onto the truth they usually pick themselves up and hurry about their business.”—attributed to Winston Churchill
It’s the question everyone wants answered: Why did so many smart people miss the signs of the collapse of the subprime market? As early as 2001, there were many danger signals about the inflating housing bubble and the rampant use of derivatives. Yet these signals were largely ignored by such financial players as Northern Rock, Countrywide, Bear Stearns, Lehman Brothers and Merrill Lynch until the reckoning arrived, harshly and abruptly. Some players were more prescient, however, and both sensed and acted on the early warning signals. In 2002, investment guru Warren Buffett derided derivatives as financial weapons of mass destruction, and in 2003 he foresaw that complex derivatives would multiply and mutate until “some event makes their toxicity clear.” Likewise, Edward Gramlich, a governor of the Federal Reserve, warned in 2001 about a new breed of lenders luring buyers with poor credit records into mortgages they could not afford.1
Some business leaders also noticed. Hedge-fund honcho John Paulson spotted “overvalued” credit markets in 2006 and made $15 billion in 2007 by shorting subprime. In July 2006, the chief U.S. economist at The Goldman Sachs Group Inc. warned that “nominal U.S. home prices may be headed for an outright decline in 2007. It would be the first decline in national home prices ever recorded, at least in nominal terms.” And in early 2007, his colleague further warned that “there are signals of a decrease in mortgage lending criteria and initial signals of financial troubles from subprime lenders.”2 Likewise, the board of the Dutch bank ABN AMRO Holding N.V. recognized the looming problems facing the banking sector and sold itself. Shareholders did very well, collecting about $100 billion before it all fell apart, with Fortis SA/NV and others in the acquiring syndicate left in ruins.3
The leading question
How can managers develop their peripheral vision to see what’s ahead more sharply?
- Managers who can identify and minimize both their personal and organizational biases are less likely to get blindsided.
- Catching and capturing distant threats and opportunities means applying different search methods—and looking for overlapping results.
1. E.L. Andrews, “Fed Shrugged as Subprime Crisis Spread,” New York Times, Dec. 18, 2007; P. Barrett, “Wall Street Staggers,” Business Week, Sept. 29, 2008, 28-31; and N.D. Schwartz and V. Bajaj, “How Missed Signs Contributed to a Mortgage Meltdown,” New York Times, Aug. 19, 2007.
2. These and other warnings were sounded by Jan Hatzius, chief U.S. economist at Goldman Sachs, July 30, 2006; Dan Sparks, mortgage department, Goldman Sachs, The Times, Jan. 2007; and again by Jan Hatzius on Feb. 12, 2007, at a Goldman Sachs housing conference.
3. Board member interview with authors; see also a detailed account in Dutch by P. Battes and P. Elshout, “De val van ABN AMRO” (Amsterdam: Business Contact, 2008).
4. R.J. Shiller, “Challenging the Crowd in Whispers, Not Shouts,” New York Times, Nov. 2, 2008, p. 5.
5. For a managerial overview of the extensive field of decision making, see J.E. Russo and P.J.H. Schoemaker, “Winning Decisions” (New York: Doubleday Publishing Co., 2001).
6. The space shuttle data oversights are discussed in S.R. Dalal, E.B. Fowlkes and B. Hoadley, “Risk Analysis of the Space Shuttle: Pre-Challenger Prediction of Failure,” Journal of the American Statistical Association 84, no. 408 (December 1989): 945-957; and E.R. Tufte, chap. 2 in “Visual and Statistical Thinking: Displays of Evidence for Making Decisions” (Cheshire, Connecticut: Graphics Press, 1997). The groupthink explanation of the Challenger case, and the associated tendency toward excessive risk taking, are examined in J.K. Esser and J.S. Lindoerfer, “Groupthink and the Space Shuttle Challenger Accident: Toward a Quantitative Case Analysis,” Journal of Behavioral Decision Making 2, no. 3 (1989): 167-177. An organizational and cultural account is offered in an excellent field study by D. Vaughan, “The Challenger Launch Decision” (Chicago: University of Chicago Press, 1996).
7. R. Wohlstetter, “Pearl Harbor: Warning and Decision” (Stanford, California: Stanford University Press, 1962); and G. Prange, “At Dawn We Slept” (New York: Penguin Books, 1981).
8. The biases mentioned here reflect multiple research streams that are too broad to cite fully. We suffice by listing some of the classic references, such as L. Festinger, “Conflict, Decision and Dissonance” (Stanford, California: Stanford University Press, 1964); I. Janis, “Groupthink: Psychological Studies of Policy Decisions and Fiascos,” 2nd ed. (Boston: Houghton Mifflin, 1982); I.L. Janis and L. Mann, “Decision Making: A Psychological Analysis of Conflict, Choice and Commitment” (New York: Free Press, 1977); and H.H. Kelley and J.L. Michela, “Attribution Theory and Research,” Annual Review of Psychology 31 (1980): 457-501.
9. The original and classic reference on groupthink is I. Janis, “Groupthink: Psychological Studies of Policy Decisions and Fiascos,” 2nd ed. (Boston: Houghton Mifflin, 1982). For a critical review of groupthink as a psychological model, see W.W. Park, “A Review of Research on Groupthink,” Journal of Behavioral Decision Making 3 (1990): 229-245.
10. The special challenges of organizational coordination and distortion are addressed in C.A. Heimer, “Social Structure, Psychology and the Estimation of Risk,” Annual Review of Sociology 14 (1988): 491-519; E. Hutchins and T. Klausen, “Distributed Cognition in an Airline Cockpit,” in “Cognition and Communication at Work,” eds. D. Middleton and Y. Engeström (Cambridge, U.K.: Cambridge University Press, 1996); and K.E. Weick and K.H. Roberts, “Collective Mind in Organizations: Heedful Interrelating on Flight Decks,” Administrative Science Quarterly 38 (1993): 357-381.
11. “A Vital Job Goes Begging,” New York Times, Feb. 12, 2005, Sec. A, p. 30.
12. Some classic sociological studies on organizational sense making include C. Perrow, “Normal Accidents: Living with High-Risk Technologies” (Princeton, New Jersey: Princeton University Press, 1999); and M. Douglas, “How Institutions Think” (Syracuse, New York: Syracuse University Press, 1986). See also L.B. Clarke and J.F. Short Jr., “Social Organization and Risk: Some Current Controversies,” Annual Review of Sociology 19 (1993): 375-399; and L.B. Clarke, “Mission Improbable: Using Fantasy Documents to Tame Disaster” (Chicago: University of Chicago Press, 2001).
13. How organizations can maintain high reliability of performance in complex environments is addressed in E. Roth, J. Multer and T. Raslear, “Shared Situation Awareness As a Contributor to High Reliability Performance in Railroad Operations,” Organization Studies 27, no. 7 (2006): 967-987; see also K.H. Roberts, “Some Characteristics of One Type of High Reliability Organization,” Organization Science 1, no. 2 (1990): 160-176.
14. K.H. Roberts, “Managing High Reliability Organizations,” California Management Review 32 (1990): 101-113; G.A. Bigley and K.H. Roberts, “The Incident Command System: High-Reliability Organizing for Complex and Volatile Task Environments,” Academy of Management Journal 44, no. 6 (2001): 1281-1299; E. Hutchins and T. Klausen, “Distributed Cognition in an Airline Cockpit,” in “Cognition and Communication at Work,” eds. D. Middleton and Y. Engeström (Cambridge, U.K.: Cambridge University Press, 1996); and K.E. Weick and K.H. Roberts, “Collective Mind in Organizations: Heedful Interrelating on Flight Decks,” Administrative Science Quarterly 38 (1993): 357-381.
15. F.K. Boersma, “The Organization of Industrial Research as a Network Activity: Agricultural Research at Philips in the 1930s,” Business History Review 78, no. 2 (2004): 255-272; and F.K. Boersma, “Structural Ways to Embed a Research Laboratory Into the Company: A Comparison Between Philips and General Electric 1900–1940,” History and Technology 19, no. 2 (2003): 109-126.
16. M.W. Dupree, book review of L. Galambos and J.E. Sewell, “Networks of Innovation: Vaccine Development at Merck, Sharp & Dohme, and Mulford, 1895-1995,” Business History, Oct. 1, 1997.
17. A classic philosophical treatment of different approaches to gathering and interpreting information is C.W. Churchman’s book “The Design of Inquiring Systems” (New York: Basic Books, 1971).
18. Sir Kevin Tebbit, interview with authors; also, see P. Bose, “Alexander the Great’s Art of Strategy” (New York: Gotham Books, Penguin Group [USA] Inc., 2003).
19. From a brief discussion of the book in Wired, www.wired.com/wired/archive/12.06/view_pr.html.
20. For more detail on the case, see P.J.H. Schoemaker, “Profiting from Uncertainty” (New York: Free Press, 2002).
21. Royal Dutch Shell used scenario planning as a learning process to help reveal the implicit mental models in its organization. This form of institutional learning can be seen as a way for management teams to “change their shared models of their company, their markets and their competitors.” A.P. de Geus, “Planning as Learning,” Harvard Business Review 66 (March-April 1988): 70-74.
22. This was the original title of an internal Shell paper by Pierre Wack, the main founder of Shell’s approach to scenario planning. The paper was later revised and published as two articles: P. Wack, “Scenarios: Uncharted Waters Ahead,” Harvard Business Review 63, no. 5 (September-October 1985): 73-89; and P. Wack, “Scenarios: Shooting the Rapids,” Harvard Business Review 63, no. 6 (November-December 1985): 139-150.
23. L. Bossidy and R. Charan, “Confronting Reality,” Fortune, Oct. 18, 2004, 225-229, excerpted from “Confronting Reality: Doing What Matters to Get Things Right” (New York: Crown Business, 2004).
24. K.A. Jehn, “A Multimethod Examination of the Benefits and Detriments of Intragroup Conflict,” Administrative Science Quarterly 40, no. 2 (June 1995): 256-282. For an excellent discussion of management conflict and performance, see K.M. Eisenhardt, J.L. Kahwajy and L.J. Bourgeois III, “Conflict and Strategic Choice: How Top Management Teams Disagree,” California Management Review 39, no. 2 (winter 1997): 42-62.
25. G. Morgenson, “How the Thundering Herd Faltered and Fell,” New York Times, Sunday, Nov. 9, 2008.
26. G. Klein, “Sources of Power” (Cambridge, Massachusetts: MIT Press, 1998); also see R.M. Hogarth, “Educating Intuition” (Chicago: University of Chicago Press, 2001).
27. V. Barabba, “Surviving Transformation: Lessons from GM’s Surprising Turnaround” (New York: Oxford University Press, 2004).
28. These unmet needs were identified in a study by Wirthlin Worldwide Inc. through two measures—the importance consumers placed on key factors that influenced their buying decisions and their current level of satisfaction with these factors.
29. This example is more fully discussed in P.J.H. Schoemaker and M.V. Mavaddat, “Scenario Planning for Disruptive Technologies,” chap. 10 in eds. G. Day and P.J.H. Schoemaker, “Wharton on Managing Emerging Technologies” (New York: Wiley, 2000).
30. See M. Neugarten, “Seeing and Noticing: An Optical Perspective on Competitive Intelligence,” Journal of Competitive Intelligence and Management 1, no. 1 (spring 2003): 93-104.