The Art of Critical Decision Making, Parts I and II
Professor Michael A. Roberto
THE TEACHING COMPANY ®

Michael A. Roberto, D.B.A.
Trustee Professor of Management, Bryant University

Michael A. Roberto is the Trustee Professor of Management at Bryant University in Smithfield, Rhode Island, where he teaches leadership, managerial decision making, and business strategy.

He joined the tenured faculty at Bryant after serving for six years on the faculty at Harvard Business School. He also has been a Visiting Associate Professor at New York University’s Stern School of Business. Professor Roberto’s new book, Know What You Don’t Know: How Great Leaders Prevent Problems before They Happen, was published by Wharton School Publishing in 2009. It examines how leaders discover hidden problems and unearth bad news in their organizations before such problems escalate to become major failures.

His 2005 book, Why Great Leaders Don’t Take Yes for an Answer, was named one of the top-10 business books of that year by The Globe and Mail, Canada’s largest daily newspaper. The book examines how leaders can cultivate constructive debate to make better decisions. Professor Roberto’s research focuses on strategic decision-making processes and senior management teams. He also has studied why catastrophic group or organizational failures happen, such as the Columbia space shuttle accident and the 1996 Mount Everest tragedy.

He has published articles based on his research in Harvard Business Review, California Management Review, MIT Sloan Management Review, The Leadership Quarterly, and Group and Organization Management. Professor Roberto’s research and teaching have earned several major awards. His 2004 article, “Strategic Decision-Making Processes: Beyond the Efficiency-Consensus Tradeoff,” was selected by Emerald Management Reviews as one of the top-50 management articles of 2004 from among 20,000 articles reviewed by that organization that year.

His multimedia case study about the 2003 space shuttle accident, titled “Columbia’s Final Mission,” earned the software industry’s prestigious Codie Award in 2006 for Best Postsecondary Education Instructional/Curriculum Solution. Finally, an article based on his research earned him the Robert Litschert Best Doctoral Student Paper Award in the year 2000 in the Academy of Management’s Business Policy Division. On the teaching front, Professor Roberto earned the Outstanding MBA Teaching Award at Bryant University in 2008. He also has won Harvard’s Allyn A. Young Prize for Teaching in Economics on two occasions.

Professor Roberto has taught in the leadership-development programs of and consulted at a number of firms including Apple, Morgan Stanley, Coca-Cola, Target, Mars, Wal-Mart, Novartis, The Home Depot, Federal Express, Johnson & Johnson, Bank of New York Mellon, and Edwards Life Sciences. He also has presented at government organizations including the FBI, NASA, and the EPA. Over the past five years, Professor Roberto has served on the faculty at the Nomura School of Advanced Management in Tokyo, where he teaches in an executive education program each summer.

Professor Roberto received an A.B. with Honors from Harvard College in 1991. He earned an M.B.A. with High Distinction from Harvard Business School in 1995, graduating as a George F. Baker Scholar. He also received his D.B.A. from the Harvard Business School in 2000. In the past, Professor Roberto worked as a financial analyst at General Dynamics, where he evaluated the firm’s performance on nuclear submarine programs. He also worked as a project manager at Staples, where he played a role in the firm’s acquisition integration efforts. In his spare time, Professor Roberto enjoys gardening, running, hiking, and cooking.

He lives in Holliston, Massachusetts, with his wife, Kristin, and his three children, Grace, Celia, and Luke.

Table of Contents
The Art of Critical Decision Making

Professor Biography
Course Scope
Lecture One Making High-Stakes Decisions
Lecture Two Cognitive Biases
Lecture Three Avoiding Decision-Making Traps
Lecture Four Framing—Risk or Opportunity?
Lecture Five Intuition—Recognizing Patterns
Lecture Six Reasoning by Analogy
Lecture Seven Making Sense of Ambiguous Situations
Lecture Eight The Wisdom of Crowds?
Lecture Nine Groupthink—Thinking or Conforming?
Lecture Ten Deciding How to Decide
Lecture Eleven Stimulating Conflict and Debate
Lecture Twelve Keeping Conflict Constructive
Lecture Thirteen Creativity and Brainstorming
Lecture Fourteen The Curious Inability to Decide
Lecture Fifteen Procedural Justice
Lecture Sixteen Achieving Closure through Small Wins
Lecture Seventeen Normal Accident Theory
Lecture Eighteen Normalizing Deviance
Lecture Nineteen Allison’s Model—Three Lenses
Lecture Twenty Practical Drift
Lecture Twenty-One Ambiguous Threats and the Recovery Window
Lecture Twenty-Two Connecting the Dots
Lecture Twenty-Three Seeking Out Problems
Lecture Twenty-Four Asking the Right Questions
Glossary
Biographical Notes
Bibliography

The Art of Critical Decision Making

Scope: Why did that leader make such a horrible decision? We have all asked that question when we have observed a poor decision, whether it be in politics, business, athletics, or the nonprofit sector. Too often, observers attribute such flawed choices to incompetence, inexperience, a lack of intelligence, or bad intentions. In most cases, though, the faulty decisions do not arise because of these factors.

In this course, we examine why leaders and organizations make poor choices, digging deep into cognitive psychology, group dynamics, and theories of organizational culture and systems to help us understand why well-intentioned, capable people blunder. Moreover, we examine the techniques and behaviors that leaders can employ to improve decision making in their organization. We focus on how leaders can design decision-making processes that marshal the collective intellect in their organizations, bringing together the diverse expertise, perspectives, and talents to determine the best course of action.

The course uses case studies to examine decision making at three levels: individual, group, and organizational. To begin, we examine how individuals make choices. We show that most individuals do not examine every possible alternative or collect mountains of information and data when making choices. Instead, most of us draw on our experience, apply rules of thumb, and use other heuristics when making decisions. Sometimes, that leads us into trouble. As it turns out, most individuals are susceptible to what psychologists call cognitive biases—decision traps that cause us to make certain systematic mistakes when making choices.

From there, we examine the intuitive process in great depth, showing that intuition is more than a gut instinct. Intuition represents a powerful pattern-recognition capability that individuals have, drawing from their wealth of past experience. However, intuition can lead us astray, and this course explains how and why that can happen, particularly when we reason by analogy. In the second major module of the course, we examine how teams make decisions, recognizing that most of us do not make all our choices on our own. Instead, we often work in groups to make complex choices.

We begin by asking the question, are groups “smarter” than individuals? We see that they can be, but in many cases, teams do not actually employ the diverse talents and knowledge of the members effectively. Thus, teams may experience a lack of synergy among the members. We show the problems that typically arise, such as groupthink—that is, the tendency for groups to experience powerful pressures for conformity, which suppress dissenting views and lead to clouded judgments. In our section on group decision making, we also examine why teams often find themselves riddled with indecision.

Most importantly, though, we examine how groups can stimulate constructive conflict, as well as achieve consensus and timely closure, so that they can overcome these problems and make better decisions. Finally, we examine decision making at the organizational level of analysis. Here, we look at a number of large-scale failures such as the Columbia space shuttle accident and the Three Mile Island nuclear power plant incident. We show that one cannot attribute such failures to one faulty decision, nor to one poor leader.

Instead, we must understand how the structure, systems, and culture of the organization shape the behavior of many individuals and teams. In these cases, we often see large-scale failures resulting from multiple small decision failures that form a chain of events leading to a catastrophe. We also look in this final module at how some organizations have discovered ways to encourage vigilant decision making in the face of high risks, such that they perform with remarkable reliability. The course concludes with a lecture on how leaders must behave differently to improve decision making in their organizations.

Specifically, leaders have to dispel the notion that they must come up with all the answers or solutions to tough problems. Instead, they must view their responsibility as designing the decision-making processes that help individuals come together to make better choices. Leaders have to create productive dialogues in their organizations, and to do so, they have to understand the pitfalls that are described in this course, as well as the techniques that can be used to enhance decision-making effectiveness. Before applying these lessons, leaders must first learn to identify the true problems facing their organizations.

By honing their skills as problem finders, leaders at all levels can preempt threats before they balloon into disasters.

Lecture One
Making High-Stakes Decisions

Scope: Why did President John F. Kennedy choose to support an invasion of the Bay of Pigs by Cuban exiles in 1961? Why did NASA choose to launch the Challenger in 1986 despite engineers’ concerns about O-ring failure? In each of these cases, leaders—and the organizations in which they worked—made flawed decisions that led to very poor outcomes.

When we think about these types of blunders, we carry with us certain myths about how leaders make decisions. We jump too quickly to the conclusion that the leadership must have had poor intentions or lacked the competence to make the right call. This lecture seeks to identify and dispel those myths about leadership and decision making. It explains how decisions actually get made in most organizations, as well as why they tend to go off track. We make the argument that failings at the individual, group, and organizational levels tend to contribute to poor decision making.

Outline

I. Decision making is one of the most essential skills that a leader must possess. In this course, we will look at how leaders can improve their ability to make high-stakes decisions in organizations. A. We have all witnessed or read about spectacular decision-making blunders. 1. Why did John F. Kennedy decide to support the Bay of Pigs invasion by a group of Cuban exiles intent on overthrowing communist dictator Fidel Castro? 2. Why did NASA decide to launch the Challenger space shuttle in 1986 despite engineers’ concerns about possible O-ring erosion due to the cold temperatures expected on the morning of the launch? 3. Why did Coca-Cola CEO Roberto Goizueta decide to introduce New Coke in 1985, changing the vaunted formula on the company’s flagship drink? B. When we observe such highly flawed decision making, we often ask ourselves, how could they have been so stupid? 1. We often attribute others’ decision-making failures to a lack of intelligence or relevant expertise, or even to personality flaws of the individuals involved. We might even question their motives. 2. We think of our own decision-making failures in a different way.

We tend to blame an unforeseeable change in external factors; we don’t attribute it to factors within ourselves such as intelligence, personality, or expertise. Psychologists describe this dichotomy as the fundamental attribution error. 3. Perhaps we think of others’ failures as the blunders of unintelligent or incapable individuals because we want to convince ourselves that we can succeed at a similar endeavor despite the obvious risks. 4. In most cases, differences in intellectual capability simply do not help us differentiate success from failure when it comes to complex, high-stakes decisions. 5. As it turns out, most leaders stumble when it comes to the social, emotional, and political dynamics of decision making. They also make mistakes because of certain cognitive traps that affect all of us, regardless of our intellect or expertise in a particular field. II. We maintain a belief in a number of myths about how decisions are made in groups and organizations. By clearly understanding how decisions are actually made in organizations, we can begin to learn how to improve our decision-making capabilities. A. Myth #1: The chief executive decides. 1. Reality: Strategic decision making entails simultaneous activity by people at multiple levels of the organization. 2. We can’t look only to the chief executive to understand why a company or nonprofit organization or school embarked on a particular course of action. B. Myth #2: Decisions are made in the room. 1. Reality: Much of the real work occurs “off-line,” in one-on-one conversations or small subgroups, not around a conference table. 2. The purpose of formal staff meetings is often simply to ratify decisions that have already been made.

C. Myth #3: Decisions are largely intellectual exercises. 1. Reality: High-stakes decisions are complex social, emotional, and political processes. 2. Social pressures for conformity and human beings’ natural desire for belonging affect and distort our decision making. 3. Emotions can either motivate us or at times paralyze us when we make important decisions. 4. Political behaviors such as coalition building, lobbying, and bargaining play an important role in organizational decision making. D. Myth #4: Managers analyze and then decide.

1. Reality: Strategic decisions unfold in a nonlinear fashion, with solutions frequently arising before managers define problems or analyze alternatives. 2. Decision-making processes rarely flow in a linear sequence, as many classic stage models suggest. 3. Sometimes, solutions go in search of problems to solve. 4. In my research, I found a number of managers who chose a course of action and then engaged their team to conduct analysis of various alternatives. They do so for a number of reasons. 5. Consider the case of Lee Iacocca and the Ford Mustang. Iacocca conducted a great deal of analysis as a tool of persuasion, not of decision making.

E. Myth #5: Managers decide and then act. 1. Reality: Strategic decisions often evolve over time and proceed through an iterative process of choice and action. 2. We often take some actions, make sense of those actions, and then make some decisions about how we want to move forward. III. To understand how decisions occur, and what can go wrong when we make critical choices, we have to understand decision making at three levels of analysis: individual, group, and organizational. A. At the individual level, we have to understand how the mind works. Sometimes, our mind plays tricks on us. Sometimes, we make biased judgments.

On other occasions, our intuition proves quite accurate. 1. We make poor decisions because of cognitive biases such as overconfidence and the sunk-cost effect. 2. Our intuition can be very powerful, but at times, we make mistakes as we match what we are seeing to patterns from our past. B. At the group level, we have to understand why teams do not always make better decisions than individuals. 1. Groups hold great promise, because we can pool the intellect, expertise, and perspectives of many people. That diversity holds the potential to enable better decisions than any particular individual could make.

2. Unfortunately, many groups do not realize that potential. They fail to realize the synergy among their members. In fact, they make decisions that are inferior to those that the best individual within the group could make on his or her own. 3. To understand group decision-making failures, we have to examine problems that groups encounter such as social pressures for conformity. C. At the organizational level, we have to understand how structure, systems, and culture shape the decisions that we make. 1. We do not make our decisions in a vacuum. Our environment shapes how we think, how we interact with those around us, and how we make judgments. 2. Organizational forces can distort the information that we receive, the interpretations of those data, and the way that communication takes place (or does not take place) among people with relevant expertise. IV. Many leaders fail because they think of decisions as events, not processes. A. We think of the decision maker sitting alone at a moment in time, pondering what choice to make. 1. However, most decisions involve a series of events and interactions that unfold over time. 2. Decisions involve processes that take place inside the minds of individuals, within groups, and across units of complex organizations.

B. Many leaders focus on finding the right solutions to problems rather than thinking carefully about what process they should employ to make key decisions. 1. When confronted with a tough issue, we focus on the question, what decision should I make? 2. We should first ask, how should I go about making this decision? C. The purpose of this course is to help us understand how to diagnose our processes of decision making, as well as how to enhance those processes moving forward. V. As we go through this course, we will draw heavily on the case method. A. A case simply involves a thick, rich description of a series of actual events.

B. In many lectures, we will dive right into a case study to begin our discussion of a particular topic. From that case, we will induce a number of key concepts and frameworks. C. We also will work deductively at times, starting with theory and then using case studies to illustrate important theories of decision making so as to bring those theories to life. D. Over time, we will learn by comparing and contrasting case studies as well. E. Research shows that people learn key ideas more effectively when they can attach those concepts to real-world examples.

F. We hope that the cases will make an indelible imprint, so that you remember the concepts and ideas that we discuss, and so that you will have a deeper understanding of them. Suggested Reading: Harrison, The Managerial Decision-Making Process. Roberto, Why Great Leaders Don’t Take Yes for an Answer. Questions to Consider: 1. Why do we often hold a distorted view of how decisions actually take place in organizations? 2. Why do we often focus more on the content of a decision than on the process of decision making? 3. What is the value of learning by the case method?

Lecture Two
Cognitive Biases

Scope: Drawing on the case study of Mount Everest, we explain how human beings tend to make certain types of classic mistakes when we make decisions. We call these mistakes cognitive biases. These biases tend to affect both novices and experts across a wide range of fields. The biases exist because we are not perfectly rational human beings, in the sense of an economist’s rational choice model of decision making. Instead, we are fundamentally bounded in our rationality; that is, we do not examine every possible option or every scrap of data before we make a decision.

We adopt certain rules of thumb and take other shortcuts when we make choices. By and large, those shortcuts help us make choices in an economical manner, so that we do not get bogged down every time we need to make a decision. However, in some cases, our cognitive limitations lead to poor decisions. In this lecture, the Mount Everest case study illustrates biases such as the sunk-cost effect, overconfidence bias, and recency effect.

Outline
I. One of the most powerful examples of flawed decision making is the 1996 Mount Everest tragedy.

A. The tragedy occurred when 2 expedition teams got caught in a storm, high on the mountain, on May 10–11, 1996. Both expedition team leaders, as well as 3 team members, died during the storm. 1. The 2 teams were commercial expeditions, meaning that individuals were clients paying to be guided to the top by a professional mountaineer. 2. Scott Fischer led the Mountain Madness team. Rob Hall led the Adventure Consultants expedition. B. Climbing Mount Everest is an incredibly arduous exercise. 1. It takes roughly 2 months to climb Everest, because you must spend at least 6 weeks preparing your body for the final push to the summit. 2. During those 6 weeks, you go through an acclimatization routine to allow your body to adjust to the low levels of oxygen at high altitude. 3. During that time, you establish a series of camps along the path to the summit, starting with Base Camp, which is at about 17,000 feet. The summit is at over 29,000 feet (well over 8000 meters). 4. The final push to the summit entails an 18-hour round trip from Camp IV to the summit. You leave late at night and climb through the dark to the summit, reaching it around midday if all goes well.

Then you climb down quickly so that you can reach Camp IV again before it gets dark. 5. Supplemental oxygen is critical for most climbers. Even with it, the climbing can be very difficult. As mountaineer David Breashears has said, it can be like “running on a treadmill while breathing through a straw. ” C. The 2 expedition teams that encountered trouble on May 10, 1996, violated some of their own rules for climbing. 1. The expedition leaders talked extensively about the need for a turnaround-time rule. The principle was that, if you could not reach the top by one or two o’clock in the afternoon, then you should turn around.

The reason is that you do not want to be climbing down in the darkness. 2. On May 10–11, 1996, many of the expedition team members did not reach the summit until late in the afternoon. Some arrived at or after four o’clock. Jon Krakauer, one of the climbers, who wrote a bestselling book about the incident, has written that “turnaround times were egregiously ignored.” 3. As a result, they were climbing high on the mountain at a far later hour than they should have been. 4. When the storm hit, they found themselves not only trying to climb down in darkness, but also during a raging blizzard.

5. Five people could not get back to Camp IV, and they died high on the mountain. II. The Mount Everest case illustrates a number of cognitive biases that impaired the climbers’ decision making. A. We are not perfectly rational actors. 1. Economists depict individuals as rational decision makers. By that, they mean that individuals collect lots of information, examine a wide variety of alternatives, and then make decisions that maximize their personal satisfaction. 2. However, we do not make decisions in a manner consistent with economic models. Nobel Prize–winner Herbert Simon has argued that humans are boundedly rational.

We are cognitively limited, such that we can’t possibly be as comprehensive in our information gathering and analysis as economists assume. 3. Herbert Simon and James March have argued that humans satisfice, rather than optimize in the way that economic theory presumes. By satisficing, they mean that we search for alternatives only to the point where we find an acceptable solution. We do not keep looking for the perfectly optimal solution. 4. In many situations, we take shortcuts. We employ heuristics and rules of thumb to make decisions. 5. Most of the time, our shortcuts serve us well.

They save us a great deal of time, and we still arrive at a good decision. B. Sometimes, though, we make mistakes. Our cognitive limitations lead to errors in judgment, not because of a lack of intelligence, but simply because we are human. 1. Psychologists describe these systematic mistakes as cognitive biases. Think of these as decision-making traps that we fall into over and over. 2. These biases affect experts as well as novices. 3. They have been shown to affect people in a wide variety of fields. Psychologists have demonstrated the existence of these biases in experimental settings as well as in field research.

III. The first cognitive bias evident in the Everest case is the overconfidence bias. A. Psychologists have shown that human beings are systematically overconfident in our judgments. B. For instance, research shows that physicians are overly optimistic in their diagnoses, even if they have a great deal of experience. C. In the Everest case, the expedition leaders clearly displayed evidence of overconfidence bias. D. Scott Fischer once said, “We’ve got the Big E completely figured out, we’ve got it totally wired. These days, I’m telling you, we’ve built a yellow brick road to the summit.” E. When one climber worried about the team’s ability to reach the summit, Rob Hall said, “It’s worked 39 times so far, pal, and a few of the blokes who summitted with me were nearly as pathetic as you.” F. Many of the climbers had arrived at very positive self-assessments. Krakauer described them as “clinically delusional.” IV. The second cognitive bias is the sunk-cost effect. A. The sunk-cost effect refers to the tendency for people to escalate commitment to a course of action in which they have made substantial prior investments of time, money, or other resources. 1. If people behaved rationally, they would make choices based on the marginal costs and benefits of their actions. They would ignore sunk costs. 2. In the face of high sunk costs, people become overly committed to certain activities even if the results are quite poor. They “throw good money after bad,” and the situation continues to escalate. 3. Barry Staw was one of the first researchers to demonstrate the sunk-cost effect in an experimental study. 4. Then in 1995, Staw and his colleague Ha Hoang studied the issue in a real-world setting.

Their study demonstrated evidence of the sunk-cost effect in the way that management and coaches made decisions in the National Basketball Association. B. In the Everest case, the climbers did not want to “waste” the time, money, and other resources that they had spent over many months to prepare for the final summit push. 1. They had spent $65,000 plus many months of training and preparing. The sunk costs were substantial. 2. Thus, they violated the turnaround-time rule, and they kept climbing even in the face of evidence that things could turn out quite badly.

Some have described it as “summit fever,” when you are so close to the top and just can’t turn back. 3. At one point, climber Doug Hansen said, “I’ve put too much of myself into this mountain to quit now, without giving it everything I’ve got.” 4. Guide Guy Cotter has said, “It’s very difficult to turn someone around high on the mountain. If a client sees that the summit is close and they’re dead set on getting there, they’re going to laugh in your face and keep going.” V. The third cognitive bias evident in the Everest case is the recency effect. A. The recency effect is actually one particular form of what is called the availability bias. 1. The availability bias is when we tend to place too much emphasis on the information and evidence that is most readily available to us when we are making a decision. 2. The recency effect is when we place too much emphasis on recent events, which of course are quite salient to us. 3. In one study of decision making by chemical engineers, scholars showed how they misdiagnosed product failures because they tended to focus too heavily on causes that they had experienced recently. B. In the case of Everest, climbers were fooled because the weather had been quite good in recent years on the mountain. 1. Therefore, climbers underestimated the probability of a bad storm. 2. David Breashears said, “Several seasons of good weather have led people to think of Everest as benevolent, but in the mid-eighties—before many of the guides had been on Everest—there were three consecutive seasons when no one climbed the mountain because of the ferocious wind.” 3. He also said, “Season after season, Rob had brilliant weather on summit day. He’d never been caught in a storm high on the mountain.” 4. We all can get fooled by recent hot streaks. We begin to get caught up in a streak of success and underestimate the probability of failure.

If we looked back over the entire history of a particular matter, we would raise our estimate of the probability of failure. C. In the lecture that follows, we will examine a number of other biases that affect decision makers. Suggested Reading: Krakauer, Into Thin Air. Russo and Schoemaker, Winning Decisions. Questions to Consider: 1. What are the costs and benefits of satisficing (relative to the optimization process depicted by economists)? 2. Why do humans find it so difficult to ignore sunk costs? 3. What are some examples of “summit fever”–type behavior in other fields?

Lecture Three
Avoiding Decision-Making Traps

Scope: This lecture continues our discussion of cognitive biases. Drawing on a number of examples ranging from the National Basketball Association to the Pearl Harbor attacks, we examine a range of cognitive biases that can lead to faulty decision making. These biases include the confirmatory bias, anchoring bias, attribution error, illusory correlation, hindsight bias, and egocentrism. We also discuss how one combats such biases. Raising awareness of these potential traps certainly can help individuals improve their decision making, but awareness alone will not protect us from failure.

We discuss how effective groups can help counter the failings of individuals, a topic we examine in further depth in the next module of the course.

Outline
I. Many other cognitive biases exist. We will focus on a few more of them in this lecture: first, and in the most depth, on the confirmation bias. This is one of the most prevalent biases that we face each day. A. The confirmation bias refers to our tendency to gather and rely on information that confirms our existing views and to avoid or downplay information that disconfirms our preexisting hypotheses.

1. As Roberta Wohlstetter described in her study of the Pearl Harbor attacks, decision makers often exhibit a “stubborn attachment to existing beliefs.” 2. One experimental study showed that we assimilate data in a biased manner because of the confirmatory bias. 3. The study examined people’s attitudes toward the death penalty and examined how individuals reacted to data in support of, as well as against, their preexisting point of view on the issue. 4. That study showed that the biased assimilation of data actually led to a polarization of views within a group of people after they looked at studies regarding the death penalty.

B. NASA’s behavior with regard to the Columbia shuttle accident in 2003 shows evidence of the confirmation bias. 1. There was clearly an attachment to existing beliefs that the foam did not pose a safety threat to the shuttle. 2. The same managers who signed off on the shuttle launch at the flight readiness review, despite evidence of past foam strikes, were responsible for then judging whether the foam strike on Columbia was a safety of flight risk. 3. It’s very difficult for those people to detach themselves from their existing beliefs, which they pronounced publicly at the flight readiness review.

4. Each safe return of the shuttle, despite past foam strikes, confirmed those existing beliefs. 5. NASA also showed evidence of not seeking disconfirming data. 6. They did not maintain launch cameras properly. 7. The mission manager also repeatedly sought the advice of an expert whom everyone knew believed foam strikes were not dangerous, while not speaking directly with those who were gravely concerned. II. The anchoring bias refers to the notion that we sometimes allow an initial reference point to distort our estimates. We begin at the reference point and then adjust from there, even if the initial reference point is completely arbitrary.

A. Scholars Amos Tversky and Daniel Kahneman demonstrated this with an interesting experiment. 1. They asked people to guess the percentage of African nations that were United Nations members. 2. They asked some if the percentage was more or less than 45% and others whether it was more or less than 65%. 3. The former group estimated a lower percentage than the latter. The scholars argued that the initial reference points served as anchors. B. This bias can affect a wide variety of real-world decisions. 1. Some have argued that people can even use anchoring bias to their advantage, such as in a negotiation.

Starting at an extreme position may serve as an anchor, and the other side may find itself adjusting from that initial arbitrary reference point. 2. Think of buying a car. The manufacturer’s suggested retail price often serves as an anchor, or certainly the dealer would like it to serve as an anchor. They would like you to adjust off of that number. 3. Some have said that combating anchoring bias requires the use of unbiased outside experts at times. For instance, does a Wall Street analyst anchor to the prior rating on a stock and therefore not offer as accurate a judgment as someone new to the job of evaluating that company’s financial performance?

III. There are a number of other biases that psychologists have identified. A. Illusory correlation refers to the fact that we sometimes jump to conclusions about the relationship between 2 variables when no relationship exists. 1. Illusory correlation explains why stereotypes often form and persist. 2. One very powerful experience can certainly feed into illusory correlation. Sometimes, odd things happen that show correlation for quite some time, but we have to be careful not to conclude that there are cause-effect relationships.

There have been links made between Super Bowl winners and stock market performance, or between the Washington Redskins’ performance and election results. B. Hindsight bias refers to the fact that we look back at past events and judge them as easily predictable when they clearly were not as easily foreseen. C. Egocentrism is when we attribute more credit to ourselves for a particular group or collective outcome than an outside party would attribute. IV. How can we combat cognitive biases in our decision making? A. We can begin by becoming more aware of these biases and then making others with whom we work and collaborate more aware of them.

B. We also can review our past work to determine if we have been particularly vulnerable to some of these biases. After-action reviews can be powerful learning moments. C. Making sure that you get rapid feedback on your decisions is also important, so as to not repeat mistakes. D. Tapping into unbiased experts can also be very helpful. E. Effective group dynamics can certainly help to combat cognitive biases. A group that engages in candid dialogue and vigorous debate may be less likely to be victimized by cognitive biases.

We will discuss this more in the next module of the course on group decision making. F. Overall, though, we should note that these biases are rooted in human nature. They are tough to avoid. Suggested Reading: Bazerman, Judgment in Managerial Decision Making. Wohlstetter, Pearl Harbor. Questions to Consider: 1. How does confirmation bias contribute to polarization of attitudes? 2. Why is awareness alone not sufficient to combat cognitive biases? 3. What are some examples of confirmation bias that have affected your decision making?

Lecture Four
Framing—Risk or Opportunity?

Scope: Drawing on case studies of the September 11 attacks, the automobile and newspaper industries, and the Vietnam War, we discuss the concept of framing. Frames are mental structures—tacit beliefs and assumptions—that simplify people’s understanding of the world around them and help them make sense of it as they decide and act. For instance, many national security officials viewed the threats facing the United States at the start of this century through a cold war frame, even though we had moved well beyond that era by the time of the 9/11 attacks.

Frames can help us, because they enable us to deal with complexity without being overwhelmed by it. However, the frames we adopt also can be quite constricting. This lecture explains how powerful frames can be and how the way that a problem is framed can, in fact, drive the types of solutions that are considered. We examine the difference between framing something as a threat versus an opportunity, as well as how framing affects our propensity to take risks. We conclude by discussing how one can encourage the use of multiple frames to enhance decision-making effectiveness.

Outline

I. Frames are mental models that we use to simplify our understanding of the complex world around us, to help us make sense of it. They involve our assumptions, often taken for granted, about how things work. How we frame a problem often shapes the solution at which we arrive. A. Economists believe that we estimate expected values when confronted with risky situations and that framing of the situation should not matter. 1. Economists would argue that we weight different possible outcomes with probabilities when faced with a risky situation and then determine what the expected value will be. 2. Most of us are slightly risk averse, meaning we would rather take an amount slightly less than the expected value, if given to us with certainty, rather than take the risk of a high or low outcome. 3. Economists do not believe that how we frame the situation should matter in terms of our decision making in risky situations. B. Prospect theory suggests that framing matters. Even small changes in wording have a substantial effect on our propensity to take risks. 1. According to prospect theory, framing does matter a great deal.

If we frame a situation in terms of a potential gain, we act differently than if we frame it in terms of a potential loss. 2. Amos Tversky and Daniel Kahneman argued that framing situations in terms of a loss causes us to take more risks. 3. In one famous experiment, they showed that we act differently if a decision is framed in terms of the probabilities that lives will be saved from a particular medical regimen versus in terms of deaths that will be prevented. 4. Their work shows that we make different decisions given alternative frames, even if the expected values in both situations are identical.
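To make the point about identical expected values concrete, here is a brief worked illustration in the spirit of Tversky and Kahneman’s experiment (the specific numbers follow the commonly cited version of their study, not the lecture itself). Suppose a disease is expected to kill 600 people, and we must choose between Program A, which saves 200 people for certain, and Program B, which saves all 600 with probability 1/3 and saves no one with probability 2/3. The expected values are identical: EV(A) = 200 lives saved, and EV(B) = (1/3 × 600) + (2/3 × 0) = 200 lives saved. Yet most respondents prefer the certain Program A when the outcomes are framed as lives saved, and prefer the gamble when the same outcomes are framed as 400 certain deaths versus a 2/3 chance of 600 deaths. The loss frame makes people more willing to take the risk, which is exactly the prospect theory prediction described above.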

C. Prospect theory may be one explanation for the escalation of commitment that occurs when there are high sunk costs. 1. The Vietnam War was a tragic example of the escalation of commitment. We gradually kept increasing our involvement, despite poor results. 2. One could argue that we poured more resources into the war because we framed the situation in terms of a loss. Thus, we had a propensity to take more and more risk to try to avoid the loss. II. Management scholars have extended this early work by arguing that we act differently when situations are framed as opportunities versus threats.

A. According to this theory, organizations act very rigidly when faced with threats, and they act much more flexibly and adaptively if they frame those same situations as opportunities. 1. In particular, scholars have argued that we tend to simply “try harder” using well-established routines and procedures when we frame something as a threat. 2. However, we may not think differently, or find new ways of working effectively. We may be doing more of what got us in trouble in the first place.

B. More recent work suggests that framing a situation as a threat may be useful in that we do allocate more resources to the problem, but we need to frame it as an opportunity to use those resources effectively. 1. In other words, we need to balance these 2 competing frames. 2. One study examined how the newspaper industry responded to the threat of the Internet. 3. The study found that those who exclusively examined it as a threat were responding by pouring dollars into the Web. However, they tended to simply replicate their hard copy online; it wasn’t a creative use of the technology. 4. Those who framed it as an opportunity did respond more adaptively, but they didn’t necessarily allocate sufficient resources to the situation. 5. The most effective organizations initially assessed the threat, but then reframed the Web as an opportunity to do exciting new things. III. Framing is a general phenomenon, not simply about binary categories. A. We’ve always adopted mental models that shape our way of looking at situations. Sometimes, though, those mental models become outdated.

1. In the case of the September 11 terrorist attacks, the 9/11 Commission found that many government agencies were still operating with a cold war frame at the time. 2. The cold war mind-set viewed threats as emanating primarily from nation-states. 3. The cold war mind-set emphasized conventional warfare and arming ourselves to protect against military attacks by the armies of other nations. 4. The various arms of the federal government were all still organized based on this cold war model of national security. They were not organized to defend against these so-called asymmetric threats.

B. Mental models ultimately come down to our taken-for-granted assumptions about how the world works. These assumptions can easily get outdated, and yet we don’t make them explicit and challenge them. 1. USC professor James O’Toole once identified the core assumptions of the management team at General Motors in the 1970s. 2. His analysis suggested that GM was unable to recognize how and when these assumptions had become outdated. 3. When the threat of Japanese imports arose, they first dismissed it. Then, having framed it as a threat, they acted very rigidly in response.

IV. What should individuals do about the fact that framing can have such a powerful effect on our decision making? A. First, leaders need to be careful about imposing their frame on their management team. In some situations, leaders may want to hold back on offering their assessment, because their framing of the situation may constrict the range of advice and alternatives brought forth by their team. B. We also should consider adopting multiple frames when we examine any particular situation. In other words, we ought to define our problems in several different ways, because each definition naturally tilts us toward one kind of solution.

C. Finally, we need to surface our implicit assumptions, and then probe and test those presumptions very carefully. Suggested Reading: Kahneman and Tversky, Choices, Values, and Frames. O’Toole, Leading Change. Questions to Consider: 1. How does framing of a situation shape the risks that we take and the amount of resources that we expend? 2. Why do we find it so difficult to shake old mental models? 3. How can we reframe situations to encourage more divergent thinking?

Lecture Five
Intuition—Recognizing Patterns

Scope: What is intuition? How does it work?

What are the classic mistakes that we make when employing our intuition? Can one develop intuition? How do we combine rational analysis and intuition effectively? Drawing on case studies from healthcare, the military, firefighting, and the video-game industry, this lecture seeks to answer these questions. Intuition, fundamentally, represents an individual’s pattern-recognition abilities based on their past experience. When we use intuition, we do not go through a rational analysis of multiple alternatives, with deep evaluation of the consequences of each option. Yet our intuition often leads to good decisions.

This lecture explains how the intuitive process works, a process whose steps we often are not aware of as they unfold. As it turns out, intuition is more than simply a gut instinct. It involves powerful cognitive processes that draw on the wealth of experiences that we have stored in our brains. Of course, intuition can lead us astray in certain predictable ways, and we will explore those pitfalls as well.

Outline
I. What is intuition? How does it affect the way we make decisions? A. Intuition is fundamentally about pattern recognition and pattern matching based on our past experience.

Psychologist Gary Klein’s work has been informative on this matter. B. When we use our intuition, we do not evaluate a whole series of alternatives, as many decision-making models suggest that we should. C. Instead, we assess a situation, and we spot certain cues. D. From these cues, we recognize patterns based on our past experience. We match the current situation to these past patterns. E. As part of that pattern matching, we often reason by analogy to past situations that seem similar to the one we currently face. F. Based on that pattern recognition, we then embark on a course of action.

We adopt certain “scripts” from our past experience. G. We don’t explore a wide range of options; instead, we tend to mentally simulate our initial preferred action. We envision how it might play out. If it seems feasible, we go with it. If not, then we might explore other options. II. How does intuition work for actual decision makers facing challenging situations? A. Firefighters use intuition when determining how to fight a blaze. They often do not assess a wide range of options, because they don’t have time to do so in many cases. 1. Klein gives an example of a firefighter who assessed a situation that appeared to be a simple kitchen fire. 2. However, certain cues (or features of the situation) did not match the pattern of experience that the firefighter had had with kitchen fires. 3. From that, he concluded that something did not seem right. This didn’t seem like an ordinary kitchen fire. 4. He ordered his men out of the building right away. The floor collapsed shortly thereafter. As it turned out, this fire was actually emanating from the basement. It was far more serious than a simple kitchen fire. B. Nurses and doctors use intuition all the time, despite all the data that you might think drive their decision making. 1. Here, you see a clear distinction between novices and experts. Novices don’t have the experience to engage in the pattern recognition that an expert can employ. 2. Nurses often report that they took action simply because they didn’t think things felt right. Something told them that the patient was in more trouble than the data suggested. 3. In one study, we examined a mechanism called rapid response teams in hospitals. 4. These teams were designed to pick up on early signs of a potential cardiac arrest and to trigger intervention to prevent such an outcome. 5. Nurses were given a set of quantitative criteria to look for in assessing patients at risk. They were also told to call the team if they simply felt uncomfortable about a situation. 6. Many hospitals reported that a substantial number of calls came when experienced nurses felt uncomfortable but the vital signs appeared relatively normal. 7. One hospital reported to us that nurse concern (without vital sign abnormalities) was the best predictor that intervention was required to prevent a bad outcome from unfolding.

C. In a case study on Electronic Arts, the leading video-game publisher, we found that intuition played a very large role in decision making. 1. The leaders of the development process did not have a formal method for evaluating games under development. 2. Instead, they relied on their intuition to determine whether a game appeared viable. 3. They often drew parallels to past situations. 4. The Electronic Arts case illustrates one of the challenges of organizations that rely heavily on intuitive decision making. 5. The question there was, how do you pass on this wisdom to newer managers?

6. It’s hard to codify that knowledge. Hospitals face the same issue with nurses. How do you pass along that intuition? They find that much of it occurs through apprenticeship and the way that expert nurses communicate their thought process to novices. Thinking out loud turns out to be a key practice that expert nurses employ. Such behaviors work much more effectively than trying to write down intuitive wisdom. III. What are the dangers of intuition? How can it lead us astray? A. We are susceptible to cognitive biases, as described in the 2 prior lectures. B. Research has shown that we sometimes misuse analogies. 1. We do not make the right match to past situations in our experience. 2. We draw the wrong lessons from those analogous situations. 3. We will explore this important issue more in the next lecture. C. In highly complex, ambiguous situations, sometimes the complexity obscures our pattern-recognition ability. D. We sometimes have outdated mental models, particularly regarding cause-and-effect relationships. E. We fail to question well-established rules of thumb. For instance, many industries adopt simple rules of thumb; they become the conventional wisdom. However, they can become outdated.

F. Intuition can lead us astray when we move outside of our experience base. Then, the new situations don’t fit nicely with the past patterns we have seen. G. Finally, it’s very hard to communicate our intuitive judgments and choices. Thus, it can be hard to persuade others to commit to our intuitive decisions or to get them to understand how and why we made that choice. This can have a detrimental effect on decision implementation. IV. How can we communicate our intuition more effectively? A. Often, when a leader uses intuition, people misinterpret the leader’s intent, and therefore implementation suffers.

1. Gary Klein has shown this in his research with military commanders. 2. The idea is that people need to understand your rationale and your intent, because in a large organization, they will then have to make their own decisions out in the field during the execution process. You want them to make decisions consistent with your original intent. 3. Klein works on exercises with military commanders where they try to issue orders with clear intent and then subordinates feed back to them what they perceive the intent to be. 4. Military commanders then learn how to clarify their explanations so as to make their thinking more transparent.

B. Organizational scholar Karl Weick has proposed a simple 5-step process for communicating intuitive decisions and garnering feedback so as to ensure clear understanding on the part of a team. 1. Here’s what I think we face. 2. Here’s what I think we should do. 3. Here’s why. 4. Here’s what we should keep our eye on. 5. Now, talk to me. V. Leaders should find ways to combine intuitive judgment with formal analysis. Here are a number of ways to effectively do so. A. Use analysis to check your intuition, but not simply to justify decisions that have already been made.

B. Use intuition to validate and test the assumptions that underlie your analysis. C. Use analysis to explore and evaluate intuitive doubts that emerge as you prepare to make a decision. D. Use the intuition of outside experts to probe the validity of your analysis. E. Use mental simulation (and premortem exercises) to enhance your analysis of alternatives. F. Do not try to replace intuition with rules and procedures. Suggested Reading: Benner, From Novice to Expert. Klein, Sources of Power. Questions to Consider: 1. What are the positive and negative effects of utilizing intuition to make key decisions? 2. How can we integrate intuition and analysis more effectively in our decision making? 3. What can we do to refine our pattern-recognition capabilities?

Lecture Six
Reasoning by Analogy

Scope: Reasoning by analogy represents one powerful dimension of the intuitive process. This lecture explains how analogical reasoning works. Put simply, when we assess a situation, we often make references or analogies to past experiences. Often, these analogies prove very helpful to us as we try to make sense of an ambiguous and challenging problem or situation.

However, analogical reasoning can cause us to make flawed decisions as well, largely because we tend to overemphasize the similarities between 2 situations when we draw analogies. Moreover, we tend to underemphasize key differences, or ignore them altogether. Drawing on case studies such as the Korean War and business examples in industries such as beer and chocolate, we explain how and why analogies lead us astray, as well as how you can improve your analogical reasoning capabilities.

Outline
I. Whether making decisions intuitively or analyzing a situation more formally, we often rely on reasoning by analogy to make key choices.

A. What is reasoning by analogy? 1. Analogical reasoning is when we assess a situation and then liken it to a similar situation that we have seen in the past. 2. We consider what worked, as well as what didn’t work, in that past situation. 3. Then, based on that assessment, we make a choice about what to do—and what definitely not to do—in the current situation. B. Why is it so powerful? 1. Analogical reasoning can save us time, because we do not necessarily have to start from scratch in search of a solution to a complex problem.

2. Analogical reasoning enables us to look back historically and avoid repeating old mistakes. 3. It also enables us to leverage past successful choices to identify best practices. 4. Research also shows that some of the most innovative ideas come when we think outside of our field of expertise and make analogies to situations in completely different domains. The analogy thus can be a powerful source of divergent thinking. C. Why can analogical reasoning be troublesome? 1. Research shows that we tend to focus on the similarities between the 2 analogous situations and downplay or ignore the differences.

2. We also become overly enamored with highly salient analogies that have left an indelible imprint on us in the past, even when those analogies may not fit the current situation. 3. We do not surface and seek to validate our underlying assumptions that are embedded in the analogical reasoning. II. What are some examples of faulty reasoning by analogy? A. Richard Neustadt and Ernest May have done some of the groundbreaking work on analogical reasoning. They refer back to the Munich analogy, which so many political leaders refer to time and time again. 1. The Munich analogy refers to Neville Chamberlain’s appeasement of Hitler in the late 1930s. 2. Whenever a dictator engages in an aggressive maneuver, we hear leaders hearken back to the Munich situation. They argue that we should confront, not appease, the dictator given the lessons of Hitler in the 1930s. 3. Neustadt and May argue that we overuse the analogy. 4. They give an example of one leader who used it well but did not fully explore the analogy, leading to later errors. Their example is Truman with regard to Korea. 5. Analogical reasoning, with reference to Munich, led Truman to rightfully stand up and defend South Korea, according to these 2 scholars.

6. However, failing to completely vet the analogy led Truman to later endorse a move to try to unify the Korean Peninsula—not an original objective of the war effort. 7. This miscalculation led to Chinese entry into the war and the long stalemate that followed. B. Many business leaders also have fallen down when they have reasoned by analogy. 1. There is the example of Staples founder Tom Stemberg and early Staples employee Todd Krasnow, who launched the dry-cleaning chain called Zoots. 2. They explicitly drew analogies to the office supplies market back before Staples and other superstores were formed. 3. The analogy proved not to be a perfect match, and Zoots struggled mightily. 4. Another example is when Pete Slosberg, founder of Pete’s Wicked Ale, tried to move into the specialty chocolate market. 5. Finally, we have the Enron example. The company reasoned by analogy as part of their new business creation strategy. 6. Enron drew analogies to the natural gas market, where they originally had success with their trading model. 7. Poor use of analogies led them far afield, eventually even taking them to the broadband market.

8. In the Enron case, we see what scholars Jan Rivkin and Giovanni Gavetti describe as “solutions in search of problems.” The Enron executives did not reason by analogy because they had a problem to solve. Instead, they started with something that had worked in the past, and they searched for new venues that they deemed analogous. The temptation with such efforts is to seriously downplay differences and to focus on similarities, particularly given the incentive schemes at Enron. III. How can we improve our reasoning by analogy? A. Neustadt and May have argued that there are 2 key things that we can do to refine our analogical reasoning. 1. We can make 2 specific lists: one describing all the likenesses between 2 situations we deem to be analogous and another describing the differences. 2. Their second technique is to write down (and clearly distinguish) that which is known, unknown, and presumed in the situation. The objective is to clearly separate fact from assumption and to then probe the presumptions carefully. B. Writing these lists down is critically important. 1. We need to be very methodical in writing down these lists, because it forces us to be much more careful in our thinking. We protect against sloppy analogical reasoning this way.

2. Moreover, by writing these lists down, we help others conduct careful critiques of our thinking. C. We can accelerate and enhance our analogical reasoning capabilities. 1. Certain types of learning experiences can help us refine our analogical reasoning abilities. For instance, one benefit of the case method is that it exposes us vicariously to many, many different situations. 2. Over time, we can compare and contrast those situations and try to apply past experience to new case studies we examine. 3. We become better and better at recognizing patterns, and we refine our ability to distinguish useful analogies from dangerous ones. 4. In a way, business education is as much about refining our intuition, and specifically our analogical reasoning capabilities, as it is about learning formal analytical frameworks. Suggested Reading: Neustadt and May, Thinking in Time. Salter, Innovation Corrupted. Questions to Consider: 1. What are some of the dangers of reasoning by analogy? 2. What types of analogies are most salient? 3. How can we sharpen our analogical reasoning?

Lecture Seven
Making Sense of Ambiguous Situations

Scope: Until now, we