
Sinkholes, Icy Roads, and Chenesium

This is nothing. I’m much more worried about sinkholes.

Last week a man was swallowed by a sinkhole while sleeping in Florida. This week, I’m more worried about sinkholes in Florida than I am about driving on icy roads in Colorado. Is that logical?

It’s not logical but it’s very real. Sometimes a story is so vivid, so unexpected, so emotionally fraught, and so available that it dominates our thinking. Even though it’s extremely unlikely, it becomes possible, maybe even probable in our imaginations. As Daniel Kahneman points out, “The world in our heads is not a precise replica of reality.”

What makes a phenomenon more real in our heads than it is in reality? Several things. It’s vivid — it creates a very clear image in our mind. It’s creepy — the vivid image is unpleasant and scary. It’s a “bad” death as opposed to a “good” death. We read about deaths every day. When we read about a kindly old nonagenarian dying peacefully after a lifetime of doing good works, it seems natural and honorable. It’s a good death. When we read about someone killed in the prime of life in bizarre or malevolent circumstances, it’s a “bad” death. A bad death is much more vivid than a good death.

But what really makes an image dominate our minds is availability. How easy is it to bring an instance to mind? If the thought is readily available to us, we deem it to be likely. What’s readily available? Anything that’s in the popular media and the topic of discussion with friends and colleagues. If your colleagues around the water cooler say, “Hey, did you hear about the guy in the sinkhole?” you’ve already begun to blow it out of proportion.

Availability can also compound itself in what Kahneman calls an availability cascade. The story itself becomes the story. Suppose that a suspicious compound — let’s call it chenesium — is found in the food supply. Someone writes that chenesium causes cancer in rats when administered in huge doses. Plus, it’s a vividly scary form of cancer — it affects the eyeballs and makes you look like a zombie. People start writing letters to the editor about the grave danger. Now it’s in the media. People start marching on state capitals, demanding action. The media write about the marching. People read about the marching and assume that where there’s smoke, there’s fire. The Surgeon General issues a statement saying the danger is minimal. But the populace — now worked into a frenzy — denounces her as a lackey of chenesium producers. Note that the media are no longer writing about chenesium. Rather, they’re writing about the controversy surrounding chenesium. The story keeps growing because it’s a good story. It’s a perfect storm.

So, what to do? Unfortunately, facts don’t matter a whole lot by this point. As Kahneman notes (quoting Jonathan Haidt), “The emotional tail wags the rational dog.” The only thing to do is to let it play out … sooner or later, another controversy will arise to take chenesium’s place.

At the personal level, we can spare ourselves a lot of worry by pondering the availability bias and remembering that facts do matter. We can look up the probability of succumbing to a sinkhole. If we do, we’ll realize that the danger is vanishingly small. There’s nothing to worry about. Still, I’m not going to Florida anytime soon.

Critical Thinking. What’s That?

No, thanks. I don’t want my doc to misdiagnose me.

I visited my doctor the other day. Expecting a bit of a wait, I took along Critical Thinking, a textbook for one of my courses. When I walked into the doctor’s office, he read the title and said, “Hmmm … critical thinking. What’s that?” I thought, “My doc just asked me what critical thinking is. This can’t be a good sign.”

I quelled my qualms, however, and explained what I teach in critical thinking class. He brightened up immediately and said, “Oh, that’s just like How Doctors Think.” I pulled out my Kindle and downloaded the book immediately. Understanding how doctors think might actually help me get better medical care.

So how do they think? Well, first they use shortcuts. They generally have way too much information to deal with and use rules of thumb called heuristics. Sound familiar? I’ve written several articles about rules of thumb and how they can lead us astray. (Just look for “thumb” in this website’s Blog Search box). So, the first answer is that doctors think just like us. Is that a good thing? Here are some errors that doctors commonly make:

Representation error — the patient is a picture of health. It’s not likely that those chest pains are a cause for concern. With this error, the doctor identifies a prototype that represents a cluster of characteristics. If you fit the prototype, fine. If not, the doctor may be diagnosing the prototype rather than you.

Attribution error — this often happens with negative stereotypes. The patient is disheveled and smells of booze. Therefore, the tremors are likely caused by alcohol rather than a hereditary disease that causes copper accumulation in the liver. That may be right most of the time but when it’s wrong, it’s really wrong.

Framing errors — I’ve read the patient’s medical charts and I see that she suffers from XYZ. Therefore, we’ll treat her for XYZ. The medical record forms a frame around the patient. Sometimes, doctors forget to step outside the frame and ask about other conditions that might have popped up. Sometimes the best approach is simply to say, “Let me tell you my story.”

Confirmation bias — we see things that confirm our beliefs and don’t see (or ignore) things that don’t. We all do it.

Availability bias — if you’re the 7th patient I’ve seen today and the first six all had the flu, there’s a good chance that I’ll diagnose you with flu, too. It just comes to mind easily; it’s readily available.

Affective bias — the doctor’s emotions get in the way. Sometimes these are negative emotions. (Tip: if you think your doctor feels negatively about you, get a new doctor). But positive emotions can also be harmful. I like you and I don’t want to cause you pain. Therefore, I won’t order that painful, embarrassing test — the one that might just save your life.

Sickest patient syndrome — doctors like to succeed just like anyone else does. With very sick patients, they may subconsciously conclude that they can’t be successful … and do less than their best.

The list goes on … but my space doesn’t. When I started the book, I thought it was probably written for doctors. But the author, Jerome Groopman, says it’s really for us laypeople. By understanding how doctors think, we can communicate more effectively with our physicians and help them avoid mistakes. It’s a good thought and a fun read.

Premature Commitment. It’s a Guy Thing.

Sin in haste. Repent at leisure.

There’s a widespread meme in American culture that guys are not good at making commitments. While that may be true in romance, it seems that the opposite — premature commitment — is a much bigger problem in business.

That’s the opinion I’ve formed from reading Paul Nutt’s book, Why Decisions Fail. Nutt analyzes 15 debacles, which he defines as “… costly decisions that went very wrong and became public…” Indeed, some of the debacles — the Ford Pinto, Nestlé infant formula, Shell Oil’s Brent Spar disposal — not only went public but created firestorms of indignation.

While each debacle had its own special set of circumstances, each also had one common feature: premature commitment. Decision makers were looking for ideas, found one that seemed to work, latched on to it, pursued it, and ignored other equally valid alternatives. Nutt doesn’t use the terminology, but in critical thinking circles this is known as satisficing or temporizing.

Here are two examples from Nutt’s book:

Ohio State University and Big Science — OSU wanted to improve its offerings (and its reputation) in Big Science. At the same time, professors in the university’s astronomy department were campaigning for a new observatory. The university’s administrators latched on to the observatory idea and pursued it, to the exclusion of other ideas. It turns out that Ohio is not a great place to build an observatory. On the other hand, Arizona is. As the idea developed, it became an OSU project to build an observatory in Arizona. Not surprisingly, the Ohio legislature asked why Ohio taxes were being spent in Arizona. It went downhill from there.

EuroDisney — Disney had opened very successful theme parks in Anaheim, Orlando, and Tokyo and sought to replicate their success in Europe. Though they considered some 200 sites, they quickly narrowed the list to the Paris area. Disney executives let it be known that it had always been “Walt’s dream” to build near Paris. Disney pursued the dream rather than closely studying the local situation. For instance, they had generated hotel revenues in their other parks. Why wouldn’t they do the same in Paris? Well… because Paris already had lots of hotel rooms and an excellent public transportation system. So, visitors saw EuroDisney as a day trip, not an overnight destination. Disney officials might have taken fair warning from an early press conference in Paris featuring their CEO, Michael Eisner. He was pelted with French cheese.

In both these cases — and all the other cases cited by Nutt — executives rushed to judgment. As Nutt points out, they then compounded their error by misusing their resources. Instead of using resources to identify and evaluate alternatives, they invested their money and energy in studies designed to justify the alternative they had already selected.

So, what to do? When approaching a major decision, don’t satisfice. Don’t take the first idea that comes along, no matter how attractive it is. Rather, take a step back. (I often do this literally — it does affect your thinking.) Ask the bigger question — What’s the best way to improve Big Science at OSU? — rather than the narrower question — What’s the best way to build a telescope?

Librarian, Farmer, Debacle

It’s a quality issue.

In his book Thinking, Fast and Slow, Daniel Kahneman offers an interesting example of a heuristic bias. Read the description, then answer the question.

Steve is very shy and withdrawn, invariably helpful but with little interest in people or in the world of reality. A meek and tidy soul, he has a need for order and structure, and a passion for detail.

Is Steve more likely to be a librarian or a farmer?

I used this example in my critical thinking class the other night. About two-thirds of the students guessed that Steve is a librarian; one-third said he’s a farmer. As we debated Steve’s profession, the class focused exclusively on the information in the simple description.

Kahneman’s example illustrates two problems with the rules of thumb (heuristics) that are often associated with our System 1 thinking. The first is simply stereotyping. The description fits our widely held stereotype of male librarians. It’s easy to conclude that Steve fits the stereotype. Therefore, he must be a librarian.

The second problem is more subtle — what evidence do we use to draw a conclusion? In the class, no one asked for additional information. (This is partially because I encouraged them to reach a decision quickly. They did what their teacher asked them to do. Not always a good idea.) Rather, they used the information that was available. This is often known as the availability bias — we make a decision based on the information that’s readily available to us. As it happens, male farmers in the United States outnumber male librarians by a ratio of about 20 to 1. If my students had asked about this, they might have concluded that Steve is probably a farmer — statistically at least.
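To see why the base rate matters so much, here is a minimal back-of-the-envelope sketch in Python. The 20-to-1 ratio comes from the paragraph above; the two likelihoods (how well the description fits a typical librarian versus a typical farmer) are assumed, illustrative numbers, not survey data.

```python
# A rough Bayesian sketch of the librarian-vs-farmer question.
# The 20:1 base rate is from the post; the two likelihoods below are
# hypothetical values chosen only to illustrate the arithmetic.

prior_odds_farmer = 20.0            # male farmers : male librarians, roughly 20 to 1

p_desc_given_librarian = 0.8        # assumed: the description fits most librarians
p_desc_given_farmer = 0.1           # assumed: the description fits few farmers

# Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio
likelihood_ratio = p_desc_given_farmer / p_desc_given_librarian
posterior_odds_farmer = prior_odds_farmer * likelihood_ratio

p_farmer = posterior_odds_farmer / (1 + posterior_odds_farmer)
print(f"Posterior odds, farmer to librarian: {posterior_odds_farmer:.2f} to 1")
print(f"Probability Steve is a farmer: {p_farmer:.0%}")
# Even though the description points strongly to "librarian",
# the base rate keeps "farmer" the better bet (about 71% here).
```

The odds form of Bayes’ rule makes the trade-off easy to see: with prior odds of 20 to 1, the description would have to be more than 20 times as likely for a librarian as for a farmer before “librarian” becomes the better guess.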

The availability bias can get you into big trouble in business. To illustrate, I’ll draw on an example (somewhat paraphrased) from Paul Nutt’s book, Why Decisions Fail.

Peca Products is locked in a fierce competitive battle with its archrival, Frangro Enterprises. Peca has lost 4% market share over the past three quarters. Frangro has added 4% in the same period. A board member at Peca — a seasoned and respected business veteran — grows alarmed and concludes that Peca has a quality problem. She sends memos to the executive team saying, “We have to solve our quality problem and we have to do it now!” The executive team starts chasing down the quality issues.

The Peca Products executive team is falling into the availability trap. Because someone who is known to be smart and savvy and experienced says the company has a quality problem, the executives believe that the company has a quality problem. But what if it’s a customer service problem? Or a logistics problem? Peca’s executives may well be solving exactly the wrong problem. No one stopped to ask for additional information. Rather, they relied on the available information. After all, it came from a trusted source.

So, what to do? The first thing to remember in making any significant decision is to ask questions. It’s not enough to ask questions about the information you have. You also need to seek out additional information. Questioning also allows you to challenge a superior in a politically acceptable manner. Rather than saying “you’re wrong!” (and maybe getting fired), you can ask, “Why do you think that? What leads you to believe that we have a quality problem?” Proverbs says that “a soft answer turneth away wrath.” So does an insightful question.

The Last of Thumb Thinking

Sure it’s dangerous. But we’re in control. No problem!

Heuristics are simply rules of thumb. They help us make decisions quickly and, in most cases, accurately. They help guide us through the day. Most often, we’re not even aware that we’re making decisions. Unfortunately, some of those decisions can go haywire — precisely because we’re operating on automatic pilot. In fact, psychologists suggest that we commonly make 17 kinds of errors via heuristics. I’ve surveyed 11 of them in previous posts (click here, here, and here). Let’s look at the last six today.

Optimistic bias — can get us into a lot of trouble. It leads us to conclude that we have much more control over dangerous situations than we actually do. We underestimate our risks. This can help us when we’re starting a new company; if we knew the real odds, we might never try. It can hurt us, however, when we’re trying to estimate the danger of cliff diving.

Hindsight bias — can also get us into trouble. Everything we did that was successful was by dint of our own hard work, talent, and skill. Everything we did that was unsuccessful was because of bad luck or someone else’s failures. Overconfidence, anyone?

Elimination by aspect — you’re considering multiple options and drop one of them because of a single issue. It’s often called the “one-strike-and-you’re-out” heuristic. I think of it as the opposite of satisficing. With satisficing, you jump on the first solution that comes along. With elimination by aspect, you drop an option at the first sign of a problem. Either way, you’re making decisions too quickly.

Anchoring with adjustment — I call this the “first-impressions-never-die” heuristic and I worry about it when I’m grading papers. Let’s say I give Harry a C on his first paper. That becomes an anchor point. When his second paper arrives, I run the risk of simply adjusting upward or downward from the anchor point. If the second paper is outstanding, I might just conclude that it’s a fluke. But it’s equally logical to assume that the first paper was a fluke while the second paper is more typical of Harry’s work. Time to weigh anchor.

Stereotyping — we all know this one: to judge an entire group based on a single instance. I got in an accident and a student from the University of Denver went out of her way to help me out. Therefore, all University of Denver students must be altruistic and helpful. Or the flip side: I had a bad experience with Delta Airlines, therefore, I’ll never fly Delta again. It seems to me that we make more negative stereotyping mistakes than positive ones.

All or Nothing — the risk of something happening is fairly low. In fact, it’s so low that we don’t have to account for it in our planning. It’s probably not going to happen. We don’t need to prepare for that eventuality. When I cross the street, there’s a very low probability that I’ll get hit by a car. So why bother to look both ways?

As with my other posts on heuristics and systematic biases, I relied heavily on Peter Facione’s book, Think Critically, to prepare this post. You can find it here.

 

 
