How to Make Sense of Weak Signals

There’s no sense in denying it: translating weak signals into useful decisions takes time and focus. These three stages can help you see the periphery, and act on it, much more clearly.

“When people stumble onto the truth they usually pick themselves up and hurry about their business.”—attributed to Winston Churchill

It’s the question everyone wants answered: Why did so many smart people miss the signs of the collapse of the subprime market? As early as 2001, there were many danger signals about the impending housing bubble and the rampant use of derivatives. Yet these signals were largely ignored by such financial players as Northern Rock, Countrywide, Bear Stearns, Lehman Brothers and Merrill Lynch until they were all forced to face the music, harshly and abruptly. Some players were more prescient, however, and both sensed and acted on the early warning signals. In 2003, investment guru Warren Buffett foresaw that complex derivatives would multiply and mutate until “some event makes their toxicity clear”; the year before, he had derided derivatives as financial weapons of mass destruction. Likewise, Edward Gramlich, a governor of the Federal Reserve, warned in 2001 about a new breed of lenders luring buyers with poor credit records into mortgages they could not afford.1

Some business leaders also noticed. Hedge-fund honcho John Paulson spotted “overvalued” credit markets in 2006 and made $15 billion in 2007 by shorting subprime. In July 2006, the chief U.S. economist at The Goldman Sachs Group Inc. warned that “nominal U.S. home prices may be headed for an outright decline in 2007. It would be the first decline in national home prices ever recorded, at least in nominal terms.” And in early 2007, his colleague further warned that “there are signals of a decrease in mortgage lending criteria and initial signals of financial troubles from subprime lenders.”2 Likewise, the board of the Dutch bank ABN AMRO Holding N.V. recognized the looming problems facing the banking sector and sold the bank. Shareholders did very well, collecting about $100 billion before it all fell apart and left Fortis SA/NV and others in the acquiring syndicate in ruins.3

The leading question

How can managers develop their peripheral vision to see what’s ahead more sharply?

Findings
  • Managers who can identify and minimize both their personal and organizational biases are less likely to get blindsided.
  • Catching and capturing distant threats and opportunities means applying different search methods—and looking for overlapping results.

References

1. E.L. Andrews, “Fed Shrugged as Subprime Crisis Spread,” New York Times, Dec. 18, 2007; P. Barrett, “Wall Street Staggers,” Business Week, Sept. 29, 2008, 28-31; and N.D. Schwartz and V. Bajaj, “How Missed Signs Contributed to a Mortgage Meltdown,” New York Times, Aug. 19, 2007.

2. These and other warnings were sounded by Jan Hatzius, chief U.S. economist at Goldman Sachs, July 30, 2006; Dan Sparks, mortgage department, Goldman Sachs, The Times, Jan. 2007; and again by Jan Hatzius on Feb. 12, 2007, at a Goldman Sachs housing conference.

3. Board member interview with authors; see also a detailed account in Dutch by P. Battes and P. Elshout, “De val van ABN AMRO” (Amsterdam: Business Contact, 2008).

4. R.J. Shiller, “Challenging the Crowd in Whispers, Not Shouts,” New York Times, Nov. 2, 2008, p. 5.

5. For a managerial overview of the extensive field of decision making, see J.E. Russo and P.J.H. Schoemaker, “Winning Decisions” (New York: Doubleday Publishing Co., 2001).

6. The space shuttle data oversights are discussed in S.R. Dalal, E.B. Fowlkes and B. Hoadley, “Risk Analysis of the Space Shuttle: Pre-Challenger Prediction of Failure,” Journal of the American Statistical Association 84, no. 408 (December 1989): 945-957; and E.R. Tufte, chap. 2 in “Visual and Statistical Thinking: Displays of Evidence for Making Decisions” (Cheshire, Connecticut: Graphics Press, 1997). The groupthink explanation of the Challenger case, and the associated tendency toward excessive risk taking, are examined in J.K. Esser and J.S. Lindoerfer, “Groupthink and the Space Shuttle Challenger Accident: Toward a Quantitative Case Analysis,” Journal of Behavioral Decision Making 2, no. 3 (1989): 167-177. An organizational and cultural account is offered in an excellent field study by D. Vaughan, “The Challenger Launch Decision” (Chicago: University of Chicago Press, 1996).

7. R. Wohlstetter, “Pearl Harbor: Warning and Decision” (Stanford, California: Stanford University Press, 1962); and G. Prange, “At Dawn We Slept” (New York: Penguin Books, 1981).

8. The biases mentioned here reflect multiple research streams that are too broad to cite fully. We limit ourselves to some of the classic references, such as L. Festinger, “Conflict, Decision and Dissonance” (Stanford, California: Stanford University Press, 1964); I. Janis, “Groupthink: Psychological Studies of Policy Decisions and Fiascoes,” 2nd ed. (Boston: Houghton Mifflin, 1982); I.L. Janis and L. Mann, “Decision Making: A Psychological Analysis of Conflict, Choice and Commitment” (New York: Free Press, 1977); and H.H. Kelley and J.L. Michela, “Attribution Theory and Research,” Annual Review of Psychology 31 (1980): 457-501.

9. The original and classic reference on groupthink is I. Janis, “Groupthink: Psychological Studies of Policy Decisions and Fiascoes,” 2nd ed. (Boston: Houghton Mifflin, 1982). For a critical review of groupthink as a psychological model, see W.W. Park, “A Review of Research on Groupthink,” Journal of Behavioral Decision Making 3 (1990): 229-245.

10. The special challenges of organizational coordination and distortion are addressed in C.A. Heimer, “Social Structure, Psychology and the Estimation of Risk,” Annual Review of Sociology 14 (1988): 491-519; E. Hutchins and T. Klausen, “Distributed Cognition in an Airline Cockpit,” in “Cognition and Communication at Work,” eds. D. Middleton and Y. Engstrom (Cambridge, U.K.: Cambridge University Press, 1996); and K.E. Weick and K.H. Roberts, “Collective Mind in Organizations: Heedful Interrelating on Flight Decks,” Administrative Science Quarterly 38 (1993): 357-381.

11. “A Vital Job Goes Begging,” New York Times, Feb. 12, 2005, Sec. A, p. 30.

12. Some classic sociological studies on organizational sense making include C. Perrow, “Normal Accidents: Living with High-Risk Technologies” (Princeton, New Jersey: Princeton University Press, 1999); and M. Douglas, “How Institutions Think” (Syracuse, New York: Syracuse University Press, 1986). See also L.B. Clarke and J.F. Short Jr., “Social Organization and Risk: Some Current Controversies,” Annual Review of Sociology 19 (1993): 375-399; and L.B. Clarke, “Mission Improbable: Using Fantasy Documents to Tame Disaster” (Chicago: University of Chicago Press, 2001).

13. How organizations can maintain high reliability of performance in complex environments is addressed in E. Roth, J. Multer and T. Raslear, “Shared Situation Awareness As a Contributor to High Reliability Performance in Railroad Operations,” Organization Studies 27, no. 7 (2006): 967-987; see also K.H. Roberts, “Some Characteristics of One Type of High Reliability Organization,” Organization Science 1, no. 2 (1990): 160-176.

14. K.H. Roberts, “Managing High Reliability Organizations,” California Management Review 32 (1990): 101-113; G.A. Bigley and K.H. Roberts, “The Incident Command System: High-Reliability Organizing for Complex and Volatile Task Environments,” Academy of Management Journal 44, no. 6 (2001): 1281-1299; E. Hutchins and T. Klausen, “Distributed Cognition in an Airline Cockpit,” in “Cognition and Communication at Work,” eds. D. Middleton and Y. Engstrom (Cambridge, U.K.: Cambridge University Press, 1996); and K.E. Weick and K.H. Roberts, “Collective Mind in Organizations: Heedful Interrelating on Flight Decks,” Administrative Science Quarterly 38 (1993): 357-381.

15. F.K. Boersma, “The Organization of Industrial Research as a Network Activity: Agricultural Research at Philips in the 1930s,” Business History Review 78, no. 2 (2004): 255-272; and F.K. Boersma, “Structural Ways to Embed a Research Laboratory Into the Company: A Comparison Between Philips and General Electric 1900–1940,” History and Technology 19, no. 2 (2003): 109-126.

16. M.W. Dupree, book review of L. Galambos and J.E. Sewell, “Networks of Innovation: Vaccine Development at Merck, Sharp & Dohme, and Mulford, 1895-1995,” Business History, Oct. 1, 1997.

17. A classic philosophical treatment of different approaches to gathering and interpreting information is C.W. Churchman’s book “The Design of Inquiring Systems” (New York: Basic Books, 1971).

18. Sir Kevin Tebbit, interview with authors; also, see P. Bose, “Alexander the Great’s Art of Strategy” (New York: Gotham Books, Penguin Group [USA] Inc., 2003).

19. From a brief discussion of the book in Wired, www.wired.com/wired/archive/12.06/view_pr.html.

20. For more detail on the case, see P.J.H. Schoemaker, “Profiting from Uncertainty” (New York: Free Press, 2002).

21. Royal Dutch Shell used scenario planning as a learning process to help reveal the implicit mental models in its organization. This form of institutional learning can be seen as a way for management teams to “change their shared models of their company, their markets and their competitors.” A.P. de Geus, “Planning as Learning,” Harvard Business Review 66 (March-April 1988): 70-74.

22. This was the original title of an internal Shell paper by Pierre Wack, the main founder of Shell’s approach to scenario planning. The paper was later revised and published as two articles: P. Wack, “Scenarios: Uncharted Waters Ahead,” Harvard Business Review 63, no. 5 (September-October 1985): 73-89; and P. Wack, “Scenarios: Shooting the Rapids,” Harvard Business Review 63, no. 6 (November-December 1985): 139-150.

23. L. Bossidy and R. Charan, “Confronting Reality,” Fortune, Oct. 18, 2004, 225-229, excerpted from “Confronting Reality: Doing What Matters to Get Things Right” (New York: Crown Business, 2004).

24. K.A. Jehn, “A Multimethod Examination of the Benefits and Detriments of Intragroup Conflict,” Administrative Science Quarterly 40, no. 2 (June 1995): 256-282. For an excellent discussion of management conflict and performance, see K.M. Eisenhardt, J.L. Kahwajy and L.J. Bourgeois III, “Conflict and Strategic Choice: How Top Management Teams Disagree,” California Management Review 39, no. 2 (winter 1997): 42-62.

25. G. Morgenson, “How the Thundering Herd Faltered and Fell,” New York Times, Sunday, Nov. 9, 2008.

26. G. Klein, “Sources of Power” (Cambridge, Massachusetts: MIT Press, 1998); also see R.M. Hogarth, “Educating Intuition” (Chicago: University of Chicago Press, 2001).

27. V. Barabba, “Surviving Transformation: Lessons from GM’s Surprising Turnaround” (New York: Oxford University Press, 2004).

28. These unmet needs were identified in a study by Wirthlin Worldwide Inc. through two measures—the importance consumers placed on key factors that influenced their buying decisions and their current level of satisfaction with these factors.

29. This example is more fully discussed in P.J.H. Schoemaker and M.V. Mavaddat, “Scenario Planning for Disruptive Technologies,” chap. 10 in eds. G. Day and P.J.H. Schoemaker, “Wharton on Managing Emerging Technologies” (New York: Wiley, 2000).

30. See M. Neugarten, “Seeing and Noticing: An Optical Perspective on Competitive Intelligence,” Journal of Competitive Intelligence and Management 1, no. 1 (spring 2003): 93-104.

Reprint #: 50317

Comments (6)
The future of startups is no startups – Benchmarking e-government in web 2.0
[…] This is a bold statement, designed at stirring discussion. But it is also based on today’s weak signals. […]
How Scenario Planning Influences Strategic Decisions – neyeblog
[…] Enron Corp. as its sole corporate sponsor. In their 2009 MIT Sloan Management Review article, “How to Make Sense of Weak Signals,” Paul J.H. Schoemaker and George S. Day described how, after Enron’s sudden collapse into […]
From Poverty to Power » What has cancer taught me about the links between medicine and development? Guest post by Chris Roche
[…] and using the ‘natural variation’ in both patients and their treatment to identify ‘signals’. ‘So what we are seeing is really just some small signals. We do what we call a discovery, where […]
Alain Coetmeur
Thanks for that interesting article, which presents a much better vision of weak signals than is usually proposed. As a (benevolent) tech-watcher following a massive weak signal about a disruptive technology, I recognize that situation.

Weak signals are not like small noise in a system; they are the interpretation of a massive volume of data filtered through a bigger groupthink.

On the French Wikipedia there is an article on weak signals -- https://fr.wikipedia.org/wiki/Signaux_faibles -- that is much more challenging than the English version:

"According to Olivier Mevel, 'weak signals' are partial and incomplete pieces of information provided by the environment, possibly alongside strong signals, which carry a specific 'order' and are recognized as such by the organization after appropriate processing. Weak signals are low-frequency signals ... even concealed signals, but deduced from information or facts. The importance of weak signals lies not in their perception, which rarely happens at face value, but in the contrasting reactions they trigger and in the dynamic scenarios they help imagine. Contrary to popular belief, weak signals are mostly extrapolations and deductions, often in directions that seem a priori impossible or even detestable.
...

In other words, weak signals are a matter of awakening, not just of watching!

I like the term "detestable," which matches the mainstream's reaction to weak signals... like the reaction of finance to Roubini, or the reactions I see today on a few scientific subjects.

Roland Benabou, in his series of papers on groupthink, describes that situation very well under the term "Mutual Assured Delusion."

Every day I recognize the "rationalization" that allows dissenting data to be ignored, and the "it is evident" that hides the absence of any evidence. I also recognize, in the news and on social networks, situations of multiple cognitive equilibria. The most widely accepted are the conspiracy theories against the mainstream, but the most hidden are the mainstream conspiracy theories (mainstream among academics, among media, sometimes both) against evidence-based challengers.

Roland Benabou's article "Groupthink: Collective Delusions in Organizations and Markets" (forthcoming in the Review of Economic Studies) is really one to read.

From a few years of practice in a controversial domain, I have observed that the network (media, academics, peer-reviewed journals) that dispatches information to the population and to decision makers is a horribly biased, self-referring, unethical filter and warper of facts. I also discovered that, contrary to the myth, academics are among the most biased against data, and that, strangely, businesses (smaller ones especially) distort the data less and terrorize dissenters less.

The worst bias is caused by a sincere desire to make a better world, better science, a better planet, a better humanity, and not to make money. It fits the finding of the Milgram experiment that people of low morality refuse to commit horrific acts, because they don't commit them for a cause.

Finding weak signals in the mass of well-intentioned but biased propaganda probably requires looking at data from money-driven, self-interested communities, and being careful with well-intentioned priests. The road to ruin is paved with good intentions.

Thanks again for the article. I hope I have convinced you to look at Benabou's work, and to be careful with any "consensus."

AlainCo the tech watcher of... a weak signal forum...
inveniam
I did my PhD dissertation on theories of strategic surprise in the US intelligence community. I found that one can make the case that more effort should be put into examining one's assumptions and guiding hypotheses than into looking for weak signals. The signals you detect, after all, are a function of those assumptions and hypotheses. Theories of strategic surprise make the point again and again that after a surprise, the signals that might have forewarned people existed, but no one was watching for them.

For more on this critique, see Philippe Silberzahn's piece "Competitive intelligence and strategic surprises: Why monitoring weak signals is not the right approach" at http://bit.ly/yl9KmA  

For a technique for sorting through hypotheses, see "Business and Intelligence Techniques: the Role of Competing Hypotheses" at: http://bit.ly/w3mvbl
markramo
Director Schoemaker, thank you for this insightful and thought-provoking article. Through years of employment experience, after being laid off, I learned to pay attention to warning signals, or "yellow flags." For example, the project you're on is getting less funding next year. Do you ignore it and put blinders on? Or start tuning up your resume? The correct answer is: tune up the resume.
So, in a crude way, I developed this awareness, and it helped me escape situations that were destined for failure. Certainly not foolproof, but the percentage over time is positive. Your article resonates and is even stimulating in terms of research ideas.

It would be an interesting hypothesis to test, but I often feel that groupthink doesn't emerge out of the vapor; rather, there is a source, a leader, a tiny power group who starts the ball rolling. Of course, subordinates eager to please "yes" their way through the day, and one can see over time, if the organization is good, how arrogance, blind faith and all the rest create the environment for groupthink. Even at the micro level, because I tend to be independent, I've seen others follow a low-level leader for fear that an honest disagreement might draw their wrath, and it's not long before everyone is just nodding their heads. The few "enemies" I've created by disagreeing with their infallibility are humorous. Point being, I'd like to see more about how "the leader" impacts "groupthink."

Thank you for reading, Mark Ramos, engineer, MBA student, University of Phoenix