Changes in today’s business environment pose vexing ethical challenges to executives. We propose that unethical business decisions may stem not from the traditionally assumed trade-off between ethics and profits or from a callous disregard of other people’s interests or welfare, but from psychological tendencies that foster poor decision making, both from an ethical and a rational perspective. Identifying and confronting these tendencies, we suggest, will increase both the ethicality and success of executive decision making.
Executives today work in a moral minefield. At any moment, a seemingly innocuous decision can explode and harm not only the decision maker but also everyone in the neighborhood. We cannot forecast the ethical landscape in coming years, nor do we think that it is our role to provide moral guidance to executives. Rather, we offer advice, based on contemporary research on the psychology of decision making, to help executives identify morally hazardous situations and improve the ethical quality of their decisions.
Psychologists have discovered systematic weaknesses in how people make decisions and process information; these new discoveries and theories are the foundation for this paper. These discoveries involve insights into errors that people make when they estimate risks and likelihoods, as well as biases in the way they seek information to improve their estimates. There are new theories about how easily our preferences can be influenced by the consequences we consider and the manner in which we consider them. Social psychologists have new information about how people divide the world into “us” and “them” that sheds new light on how discrimination operates. Finally, there has been important new research into the dimensions along which people think that they are different from other people, which helps explain why people might engage in practices that they would condemn in others.1
We focus on three types of theories that executives use in making decisions — theories about the world, theories about other people, and theories about ourselves. Theories about the world refer to the beliefs we hold about how the world works, the nature of the causal network in which we live, and the ways in which our decisions influence the world. Important aspects of our theories about the world involve our beliefs about the probabilistic (or deterministic) texture of the world and our perceptions of causation.
Theories about other people are our organized beliefs about how “we” are different from “they.”
1. For the research on which we based this article, see:
M.H. Bazerman, Judgment in Managerial Decision Making (New York: John Wiley, 1994);
R.M. Dawes, Rational Choice in an Uncertain World (San Diego, California: Harcourt Brace Jovanovich, 1988);
T. Gilovich, How We Know What Isn’t So (New York: Free Press, 1991); and
S. Plous, The Psychology of Judgment and Decision Making (New York: McGraw Hill, 1993).
A forthcoming book will explore these and other topics in greater detail. See:
D.M. Messick and A. Tenbrunsel, Behavioral Research and Business Ethics (New York: Russell Sage Foundation, forthcoming).
2. G. Hardin, Filters Against Folly (New York: Penguin, 1985).
3. R. Nader, Unsafe at Any Speed (New York: Grossman Publishers, 1965).
4. M. Rothbart and M. Snyder, “Confidence in the Prediction and Postdiction of an Uncertain Outcome,” Canadian Journal of Behavioral Science 2 (1970): 38–43.
5. I.L. Janis, Groupthink: Psychological Studies of Policy Decisions and Fiascoes (Boston: Houghton Mifflin, 1982).
6. B. Fischhoff, “Hindsight: Thinking Backward,” Psychology Today 8 (1975): 71–76.
7. D. Kahneman and A. Tversky, “Prospect Theory: An Analysis of Decision under Risk,” Econometrica 47 (1979): 263–291.
8. Bazerman (1994); and Kahneman and Tversky (1979).
9. Kahneman and Tversky (1979).
10. A.L. McGill, “Context Effects in the Judgment of Causation,” Journal of Personality and Social Psychology 57 (1989): 189–200.
11. I. Ritov and J. Baron, “Reluctance to Vaccinate: Omission Bias and Ambiguity,” Journal of Behavioral Decision Making 3 (1990): 263–277.
12. For further details on many of these issues, interested readers may consult:
S. Worcheland and W.G. Austin, Psychology of Intergroup Relations (Chicago: Nelson-Hill, 1986).
13. M.B. Brewer, “In-Group Bias in the Minimal Intergroup Situation: A Cognitive-Motivational Analysis,” Psychological Bulletin 86 (1979): 307–324.
14. For example, see:
S.E. Taylor, Positive Illusions (New York: Basic Books, 1989).
15. S.E. Taylor and J.D. Brown, “Illusion and Well-Being: A Social Psychological Perspective,” Psychological Bulletin 103 (1988): 193–210.
16. R.M. Kramer, E. Newton, and P.L. Pommerenke, “Self-Enhancement Biases and Negotiator Judgment: Effects of Self-Esteem and Mood,” Organizational Behavior and Human Decision Processes 56 (1993): 110–133.
17. M. Ross and F. Sicoly, “Egocentric Biases in Availability and Attribution,” Journal of Personality and Social Psychology 37 (1979): 322–336.
18. S. Lichtenstein, B. Fischhoff, and L.D. Phillips, “Calibration of Probabilities,” in D. Kahneman, P. Slovic, and A. Tversky, eds., Judgment under Uncertainty: Heuristics and Biases (Cambridge: Cambridge University Press, 1982), pp. 306–334.
19. B. Fischhoff, P. Slovic, and S. Lichtenstein, “Knowing with Certainty: The Appropriateness of Extreme Confidence,” Journal of Experimental Psychology: Human Perception and Performance 3 (1977): 552–564.
21. R.M. Cambridge and R.C. Shreckengost, “Are You Sure? The Subjective Probability Assessment Test” (Langley, Virginia: Office of Training, Central Intelligence Agency, unpublished manuscript, 1980).
22. P.C. Wason, “On the Failure to Eliminate Hypotheses in a Conceptual Task,” Quarterly Journal of Experimental Psychology 12 (1960): 129–140.
23. Janis (1982).
24. S. Bok, Lying: Moral Choice in Public and Private Life (New York: Vintage Books, 1989).