It often seems that changes and threats come out of nowhere – until we learn later that the signals were there all along and we just didn't read them correctly. One step toward reading them better is understanding why we misinterpret them in the first place.

Editor’s Note: The following is an excerpt from a work in progress titled “Detecting Weak Signals.” It will be completed for the Spring issue of the Review. The authors aim to identify “methods that will help managers improve the process of surfacing, amplifying and clarifying potentially important weak signals.” Here, they address the factors that inhibit managers from spotting those signals from the start.

It’s the question everyone wants answered: Why did so many smart people miss the signs of the collapse of the subprime market? There were many danger signals about the impending housing bubble and the rampant use of derivatives as early as 2001. Yet these signals were largely ignored by such financial players as Northern Rock, Countrywide, Bear Stearns, Lehman Brothers and Merrill Lynch until they all had to face the music harshly and abruptly. Some players were more prescient, however. As early as 2002, investment guru Warren Buffett derided derivatives as financial weapons of mass destruction, and in 2003 he foresaw that complex derivatives would multiply and mutate until “some event makes their toxicity clear.”1

The leading question

Why do managers often overlook or misread information that should inform their judgment?

Findings
  • Both individual and organizational biases inhibit the clear interpretation of information during decision making.
  • “Groupthink” is particularly coercive.
  • Decision makers do see signals, but jump to the most convenient conclusion about them.

So, what separates the hapless from the prescient few? Did the siren call of outsize profits and bonuses, coupled with the delusional promises of manageable risk, dull their senses? Was their ability to see sooner and more clearly compromised by the information overload, organizational filters and cognitive biases that afflict sense making in all organizations?

All managers are susceptible to the distortions and biases we saw in the credit crunch of 2008. Organizations get blindsided not so much because decision makers aren’t seeing signals, but because they jump to the most convenient or plausible conclusion, rather than fully considering other interpretations. Our own research suggests that less than 20% of global firms have sufficient capacity to spot, interpret and act on the weak signals of forthcoming threats and opportunities. Such peripheral signals are, by definition, muddied and imprecise.


Seeing such signals ahead of time is far more challenging than it seems in hindsight. Both individual and organizational biases prevent peripheral but essential signals from getting through. Awareness of these biases is a first step toward avoiding their traps.

Objectivity Undermined

Among the best established of these traps are the biases that underlie how information is filtered, interpreted and often bolstered.2 The net effect of these biases is that we frame a complex or ambiguous issue in a certain way — without fully appreciating other possible perspectives — and then become overconfident about that particular view.

Filtering. What we actually pay attention to is very much determined by what we expect to see. Psychologists call this selective perception. If something doesn’t fit, we often distort reality to make it fit our mental model.

Distorted Inference. Whatever information passes through our cognitive and emotional filters may be subject to further distortion. One well-known bias is rationalization: interpreting evidence in a way that sustains a desired belief. We fall victim to this, for example, when trying to shift blame for a mistake we made onto someone else. Another common interpretation bias is egocentrism, through which we overemphasize our own role in the events we seek to explain. This tendency is related to the fundamental attribution bias, which causes us to ascribe more importance to our own actions than to those of others or of the environment. We often view our organization as being a more central actor than it really is.

Bolstering. Not only do we heavily filter the limited information we pay attention to; we may also seek to bolster our case by searching for additional evidence that supports our view. To achieve that, we might disproportionately talk to people who already agree with us. Or we may actively look for new evidence that confirms our perspective — the so-called confirmation bias — rather than pursuing a more balanced search strategy that includes disconfirming evidence.

Getting Along, Getting It Wrong

In addition to our personal biases, when we function in organizations we can end up suffering from what Irving Janis termed “groupthink.”3 In principle, groups should be better than individuals at detecting changes and responding to them. But often they are not, especially if the team is poorly managed, under pressure or careful not to rock the boat. Information sharing and debate are especially important when it comes to understanding the shadowy signals that lurk at the periphery.

Organizational sense making occurs in a complex social environment in which people are sensitive not just to what is being said but also to who is speaking. Source credibility is influenced by many factors, such as status, past experience, politics and the like. These social biases will be especially strong when the information is weak or incomplete.

The individual and organizational biases discussed above underscore why it is important to bring together different perspectives on the same issue…. [To come, in the Review’s Spring issue: How these different perspectives can be cultivated to enable an organization to make sense of the weak information it receives.]