Information Failures and Organizational Disasters

INTELLIGENCE: RESEARCH BRIEF: Vigilance is the key to avoiding organizational nightmares.


In September 2004, Merck & Co. Inc. initiated the largest prescription drug withdrawal in history. After more than 80 million patients had taken Vioxx for arthritis pain since 1999, the company withdrew the drug because of an increased risk of heart attack and stroke. As early as 2000, however, the New England Journal of Medicine published the results of a Merck trial showing that patients taking Vioxx were four times as likely to have a heart attack or stroke as patients taking naproxen, a competing drug. Yet Merck kept the product on the market for four more years, despite mounting evidence from a variety of sources that use of Vioxx was problematic. Not until 2004, when the company was testing whether Vioxx could be used to treat an unrelated disease, did Merck decide to withdraw the drug, after an external panel overseeing the clinical trial recommended stopping it because patients on Vioxx were twice as likely to have a heart attack or stroke as those on a placebo.

Merck’s voluntary withdrawal of Vioxx is emblematic of how most organizational disasters incubate over long gestation periods, during which errors and warning signs build up. While these signals become painfully clear in hindsight, the challenge for organizations is to develop the capability to recognize and treat these precursor conditions before they spiral into failure. Research on the topic provides a theoretical basis for explaining why such disasters happen and highlights information practices that can reduce the risk of catastrophic failure.

Why Catastrophic Accidents Happen

While human error often precipitates an accident or crisis in an organization, focusing on human error alone misses the systemic contexts in which accidents occur and can happen again (Reason, 1997). Perrow’s Normal Accident Theory (1999) maintains that accidents and disasters are inevitable in complex, tightly coupled technological systems, such as nuclear power plants. In his theory, unexpected interactions among independent failures, propagated through the tight coupling between subsystems, escalate initial failures into a general breakdown, a combination that makes accidents seem inevitable, or “normal.” Rasmussen (1997) argues that accidents can happen when work practices drift or migrate under the influence of two sets of forces: the desire to complete work with a minimum expenditure of mental and physical energy, which pulls practices toward least effort, and management pressure, which pulls practices toward a minimum expenditure of resources.
