Last week a man was swallowed by a sinkhole while sleeping in Florida. This week, I’m more worried about sinkholes in Florida than I am about driving on icy roads in Colorado. Is that logical?
It’s not logical but it’s very real. Sometimes a story is so vivid, so unexpected, so emotionally fraught, and so available that it dominates our thinking. Even though it’s extremely unlikely, it becomes possible, maybe even probable in our imaginations. As Daniel Kahneman points out, “The world in our heads is not a precise replica of reality.”
What makes a phenomenon more real in our heads than it is in reality? Several things. It’s vivid — it creates a very clear image in our mind. It’s creepy — the vivid image is unpleasant and scary. It’s a “bad” death as opposed to a “good” death. We read about deaths every day. When we read about a kindly old nonagenarian dying peacefully after a lifetime of doing good works, it seems natural and honorable. It’s a good death. When we read about someone killed in the prime of life in bizarre or malevolent circumstances, it’s a “bad” death. A bad death is much more vivid than a good death.
But what really makes an image dominate our minds is availability. How easy is it to bring an instance to mind? If the thought is readily available to us, we deem it to be likely. What’s readily available? Anything that’s in the popular media and the topic of discussion with friends and colleagues. If your colleagues around the water cooler say, “Hey, did you hear about the guy in the sinkhole?” you’ve already begun to blow it out of proportion.
Availability can also compound itself in what Kahneman calls an availability cascade. The story itself becomes the story. Suppose that a suspicious compound — let’s call it chenesium — is found in the food supply. Someone writes that chenesium causes cancer in rats when administered in huge doses. Plus, it’s a vividly scary form of cancer — it affects the eyeballs and makes you look like a zombie. People start writing letters to the editor about the grave danger. Now it’s in the media. People start marching on state capitols, demanding action. The media write about the marching. People read about the marching and assume that where there’s smoke, there’s fire. The Surgeon General issues a statement saying the danger is minimal. But the populace — now worked into a frenzy — denounces her as a lackey of chenesium producers. Note that the media are no longer writing about chenesium. Rather, they’re writing about the controversy surrounding chenesium. The story keeps growing because it’s a good story. It’s a perfect storm.
So, what to do? Unfortunately, facts don’t matter a whole lot by this point. As Kahneman notes (quoting Jonathan Haidt), “The emotional tail wags the rational dog.” The only thing to do is to let it play out … sooner or later, another controversy will arise to take chenesium’s place.
At the personal level, we can spare ourselves a lot of worry by pondering the availability bias and remembering that facts do matter. We can look up the probability of succumbing to a sinkhole. If we do, we’ll realize that the danger is vanishingly small. There’s nothing to worry about. Still, I’m not going to Florida anytime soon.
In his book, Thinking, Fast and Slow, Daniel Kahneman offers an interesting example of a heuristic bias. Read the description, then answer the question.
Steve is very shy and withdrawn, invariably helpful but with little interest in people or in the world of reality. A meek and tidy soul, he has a need for order and structure, and a passion for detail.
Is Steve more likely to be a librarian or a farmer?
I used this example in my critical thinking class the other night. About two-thirds of the students guessed that Steve is a librarian; one-third said he’s a farmer. As we debated Steve’s profession, the class focused exclusively on the information in the simple description.
Kahneman’s example illustrates two problems with the rules of thumb (heuristics) that are often associated with our System 1 thinking. The first is simply stereotyping. The description fits our widely held stereotype of male librarians. It’s easy to conclude that Steve fits the stereotype. Therefore, he must be a librarian.
The second problem is more subtle — what evidence do we use to draw a conclusion? In the class, no one asked for additional information. (This is partially because I encouraged them to reach a decision quickly. They did what their teacher asked them to do. Not always a good idea.) Rather, they used the information that was available. This is often known as the availability bias — we make a decision based on the information that’s readily available to us. As it happens, male farmers in the United States outnumber male librarians by a ratio of about 20 to 1. If my students had asked about this, they might have concluded that Steve is probably a farmer — statistically at least.
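The base-rate logic can be made concrete with a quick Bayes calculation. This is only a sketch: the 20-to-1 ratio comes from the text, but the two likelihoods (how well Steve’s description fits a typical librarian versus a typical farmer) are hypothetical numbers chosen purely for illustration.

```python
# Base-rate sketch for the "Steve" example.
# From the text: male farmers outnumber male librarians about 20 to 1.
# Hypothetical assumptions: the description fits 50% of male librarians
# but only 10% of male farmers.

farmers_per_librarian = 20.0

p_desc_given_librarian = 0.50  # hypothetical likelihood
p_desc_given_farmer = 0.10     # hypothetical likelihood

# Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio
prior_odds_librarian = 1.0 / farmers_per_librarian
likelihood_ratio = p_desc_given_librarian / p_desc_given_farmer
posterior_odds = prior_odds_librarian * likelihood_ratio

p_librarian = posterior_odds / (1.0 + posterior_odds)
print(f"P(librarian | description) = {p_librarian:.2f}")  # prints 0.20
```

Even with a description that sounds five times more “librarian” than “farmer,” the base rate makes Steve four times more likely to be a farmer — which is exactly the information the students never asked for.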
The availability bias can get you into big trouble in business. To illustrate, I’ll draw on an example (somewhat paraphrased) from Paul Nutt’s book, Why Decisions Fail.
Peca Products is locked in a fierce competitive battle with its archrival, Frangro Enterprises. Peca has lost 4% market share over the past three quarters. Frangro has added 4% in the same period. A board member at Peca — a seasoned and respected business veteran — grows alarmed and concludes that Peca has a quality problem. She sends memos to the executive team saying, “We have to solve our quality problem and we have to do it now!” The executive team starts chasing down the quality issues.
The Peca Products executive team is falling into the availability trap. Because someone who is known to be smart and savvy and experienced says the company has a quality problem, the executives believe that the company has a quality problem. But what if it’s a customer service problem? Or a logistics problem? Peca’s executives may well be solving exactly the wrong problem. No one stopped to ask for additional information. Rather, they relied on the available information. After all, it came from a trusted source.
So, what to do? The first thing to remember in making any significant decision is to ask questions. It’s not enough to ask questions about the information you have. You also need to seek out additional information. Questioning also allows you to challenge a superior in a politically acceptable manner. Rather than saying “you’re wrong!” (and maybe getting fired), you can ask, “Why do you think that? What leads you to believe that we have a quality problem?” Proverbs says that “a soft answer turneth away wrath”. So does an insightful question.
Can we think with our thumbs? Well, metaphorically we do. When we use System 1 — our fast, automatic, energy-efficient thinking system — we use heuristics, shortcuts to get to an answer that is “good enough”. We often refer to heuristics as rules of thumb — rough and ready ways to deal with reality.
Our rules of thumb work most of the time but not all of the time. Psychologists have classified 17 different errors that we make when we use System 1. Let’s look at three today.
Satisficing and temporizing are two errors that often go hand in hand. Satisficing simply means that when we find a choice that’s good enough, we take it and don’t search any further. (The definition of “good enough” is entirely up to you.) Defense lawyers regularly accuse the police of satisficing. The accusation goes something like this: “You found my client and decided that he committed the crime. You stopped looking for any other suspects. You let the real criminal get away.”
Temporizing is similar to satisficing but adds a time dimension. You’re temporizing when you choose an option that’s good enough for now. How much education do you need? Well, let’s say that you can get a good job immediately with only a Bachelor’s degree. It’s good enough for now. But, 20 years from now you may not be able to get the promotion you want because you don’t have a Master’s degree. You may regret that you temporized in your younger years.
If you ever hear someone say, “if it ain’t broke, don’t fix it” you may well conclude that they’re either satisficing or temporizing. Whatever “it” is, it’s good enough for now.
Availability is another error category that we encounter often. When we’re asked a difficult question, we often search our memory banks for cases that would help us develop an answer. If we can recall cases easily, we tend to overestimate the probability that the same phenomenon will occur again. In other words, if the cases are readily available (to our memory), we tend to exaggerate their probability. This is especially true with vivid memories. This is one reason that people tend to overestimate the crime rate in their communities. Recent crimes are readily recalled — you read about them in the papers every day. Gruesome crimes create vivid memories — thus, many people think that gruesome crimes occur far more frequently than they do.
Available memories don’t have to be recent. In fact, vivid memories can last for years and affect our judgment and behavior in subtle ways. Indeed, I still go easy on tequila because of vivid memories from college days.
Satisficing, temporizing, and availability are three rules of thumb that help us get through the day. They’re part of System 1, which we can’t turn off, so we’re always vulnerable to these types of errors. In general, the benefits of System 1 outweigh the costs, but you should be aware of the costs. If the costs are getting out of hand, it’s time to switch on System 2.