I. Anchoring - Let me ask you two quick questions:
- Do you think that the population of Malaysia is greater than 100 million?
- What is your best estimate of the population of California?
If you're like most people, your answer to the second question was influenced by the "information" in the first. This is an example of the anchoring trap -- our tendency to give disproportionate weight to the first information we receive. When do you fall into this trap? How about when creating forecasts? Do you rely too heavily on past data? The trap also appears frequently in negotiations (opening positions), during the hiring process (first impressions last), and during performance evaluations (let me review last year's evaluation first). Leaders who answer today's challenges with yesterday's solutions also fall into this trap. (By the way, the population of Malaysia is 25,274,132 and California's is 36,756,666 :-)
II. Status Quo - Please complete this sentence: "A body in motion tends to..." If you answered, "stay in motion," you get a gold star. This law of physics is also at work when we make decisions. We have a tendency to stick to the status quo -- to leave well enough alone and go with the flow. When making decisions, this means that leaders feel safer not trying something new. That's because they have been conditioned by their organization's culture not to rock the boat. In most organizations, it's less risky to do what is conventional. As Professor Hammond states, "Sins of commission (doing something) tend to be punished much more severely than sins of omission (doing nothing)." (1)
III. Sunk Costs - Have you ever continued to fund a project that should have been canceled long ago? Did you ever spend too much time trying to improve the performance of an employee that you should have fired earlier? Have you heard about bankers who continued to lend money to a failing business? These scenarios are all examples of the sunk costs trap -- "the tendency to continue an endeavor once an investment in money, effort, or time has been made." (2) The reason leaders continue to “throw good money after bad” is that they are reluctant to admit errors to themselves or to others.
IV. Framing Trap - The framing trap can be illustrated by asking you a few questions:
Would you accept a 50-50 chance of either losing $300 or winning $500?
What if, instead, I asked you this question:
Would you prefer to keep your checking account balance of $2,000 or to accept a 50-50 chance of having either $1,700 or $2,500 in your account?
If you actually had $2,000 in your checking account, these two questions pose the same problem and the same risk. From a rational perspective, your decision should be the same in both cases. However, numerous studies have shown that many people decide to refuse the 50-50 chance in the first question but accept it in the second. This is because of their different reference points (i.e., frames). The first question emphasizes absolute gains and losses, which triggers the thought of losing money. The second question, with its reference point of $2,000, frames the decision differently by emphasizing the relatively minor financial impact of losing money when you already have $2,000.
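A quick sketch (in Python, as an illustration, not from the article) makes the equivalence of the two framings explicit, assuming the $2,000 starting balance mentioned above:

```python
# Both framing questions describe the same gamble; only the
# reference point differs.
balance = 2000

# Frame 1: a 50-50 chance of losing $300 or winning $500,
# expressed as changes to your current balance.
frame1_outcomes = {balance - 300, balance + 500}

# Frame 2: a 50-50 chance of ending with $1,700 or $2,500,
# expressed as final account balances.
frame2_outcomes = {1700, 2500}

# The two sets of final outcomes are identical.
print(frame1_outcomes == frame2_outcomes)
```

Rationally, then, the two questions should produce the same answer; the studies cited above show they do not.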
You can also see the same principle at work in the stem cell debate. Pro-life politicians frame stem cell research as "murder" because it destroys human embryos. Stem cell advocates fight back by framing the research as our best hope of attacking debilitating diseases affecting millions of Americans. Advocates know that it is difficult to call an influential spokesperson like actor Michael J. Fox a "murderer."
V. False Assumptions - Professors Robert Cross and Susan Brodt tell the story of a Fortune 100 company that made a major investment to manufacture and distribute a core product in Asia. (3) They reported that the project's champion knew very little about Asia but was convinced he could succeed there just as he had in the United States. He held fast to his assumptions despite financial, operational, and strategic information that contradicted his views. After the fiasco, the project manager and senior executives realized that they had made a bad decision because of false assumptions. How often do you do that?
VI. Missed Signals - In the mid-1990s, one of England's oldest merchant banks was bankrupted by $1 billion in unauthorized trading losses. An official report on the collapse concluded that "a number of warning signs were present, but that individuals in a number of different departments failed to face up to, or follow up on, identified problems." (4) Judging by the current economic situation, many financial institutions still fail to pay attention to such signals. How often do you miss the signals?
VII. Competition Trap - The emotional urge to win during a competitive challenge often leads to costly decision errors. Boston Scientific fell into this trap during its acquisition of the medical device maker Guidant. As you may recall, Johnson & Johnson (J&J) announced plans to acquire Guidant in late 2004. Soon thereafter, J&J threatened to pull out and lowered its offer price because of Guidant's pacemaker recall. That's when Boston Scientific -- J&J's rival -- offered to buy Guidant, and the bidding war was on. Eventually, Boston Scientific's final and "winning" offer of $27.2 billion was $1.8 billion more than J&J's initial bid. Most financial analysts believe this was a disastrous acquisition for Boston Scientific. According to an article in the Harvard Business Review, the emotion of winning overrode sound decision-making. (5) Does this sound familiar to you?
Just like a sand trap in golf, these decision traps are hazards to be avoided. Which ones cause the biggest problem for you?
Keep on eXpanding,
1. John S. Hammond, Ralph L. Keeney, and Howard Raiffa, "The Hidden Traps in Decision Making," Harvard Business Review, January 2006, pages 118-126.
2. Itamar Simonson and Peter Nye, "The Effect of Accountability on Susceptibility to Decision Errors," Organizational Behavior and Human Decision Processes, 51, 416-446, 1992.
3. Robert Cross and Susan Brodt, "How Assumptions of Consensus Undermine Decision Making," MIT Sloan Management Review, Winter 2001, pages 86-94.
4. "Information Failures and Organizational Disasters," MIT Sloan Management Review, Spring 2005, pages 8-10.
5. D. Malhotra, G. Ku, and J. K. Murnighan, "When Winning Is Everything," Harvard Business Review, May 2008, pages 78-86.