
Thinking With Your Thumbs – Part 1

Do you bite your thumb at me, sir?

Can we think with our thumbs? Well, metaphorically we do. When we use System 1 — our fast, automatic, energy-efficient thinking system — we use heuristics, shortcuts to get to an answer that is “good enough”. We often refer to heuristics as rules of thumb — rough-and-ready ways to deal with reality. (For a comparison of System 1 and System 2, see “Thinking: System 1 and System 2” below.)

Our rules of thumb work most of the time but not all of the time. Psychologists have classified 17 different errors that we make when we use System 1. Let’s look at three today.

Satisficing and temporizing are two errors that often go hand in hand. Satisficing simply means that when we find a choice that’s good enough, we take it and don’t search any further. (The definition of “good enough” is entirely up to you.) Defense lawyers regularly accuse the police of satisficing. The accusation goes something like this: “You found my client and decided that he committed the crime. You stopped looking for any other suspects. You let the real criminal get away.”
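
Seen as a search procedure, satisficing is simply “stop at the first option that clears the bar.” Here’s a minimal Python sketch of the idea (the job offers and scores are invented for illustration), contrasted with an exhaustive, System 2-style search for the best option:

```python
# Satisficing vs. optimizing over a list of hypothetical job offers.
# All names and scores are invented for illustration.

def satisfice(options, good_enough):
    """Return the first option that meets the threshold, then stop looking."""
    for option in options:
        if option["score"] >= good_enough:
            return option  # good enough -- the search ends here
    return None  # nothing cleared the bar

def optimize(options):
    """Exhaustive, System 2-style search: examine every option, keep the best."""
    return max(options, key=lambda o: o["score"], default=None)

offers = [
    {"name": "Offer A", "score": 72},
    {"name": "Offer B", "score": 91},  # the true best, which satisficing never sees
    {"name": "Offer C", "score": 65},
]

print(satisfice(offers, good_enough=70))  # -> Offer A: acceptable, found cheaply
print(optimize(offers))                   # -> Offer B: better, but costs a full scan
```

The trade-off is the whole point: satisficing is cheap and usually fine; the exhaustive search is costly and only occasionally worth it.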

Temporizing is similar to satisficing but adds a time dimension. You’re temporizing when you choose an option that’s good enough for now. How much education do you need? Well, let’s say that you can get a good job immediately with only a Bachelor’s degree. It’s good enough for now. But 20 years from now, you may not be able to get the promotion you want because you don’t have a Master’s degree. You may regret that you temporized in your younger years.

If you ever hear someone say, “if it ain’t broke, don’t fix it,” you may well conclude that they’re either satisficing or temporizing. Whatever “it” is, it’s good enough for now.

Availability is another error category that we encounter often. When we’re asked a difficult question, we often search our memory banks for cases that would help us develop an answer. If we can recall cases easily, we tend to overestimate the probability that the same phenomenon will occur again. In other words, if the cases are readily available (to our memory), we tend to exaggerate their probability. This is especially true with vivid memories. This is one reason that people tend to overestimate the crime rate in their communities. Recent crimes are readily recalled — you read about them in the papers every day. Gruesome crimes create vivid memories — thus, many people think that gruesome crimes occur far more frequently than they do.
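
To see how vividness skews the estimate, consider a toy simulation (every number in it is invented): gruesome crimes make up a small share of actual crimes, but because they’re far more likely to be remembered, they dominate the sample that memory hands back to us.

```python
import random

random.seed(0)

# Toy model of the availability error; every number here is invented.
# In this "reality," only 10% of crimes are gruesome, but gruesome
# crimes are far more likely to stick in memory than mundane ones.
N_CRIMES = 10_000
P_GRUESOME = 0.10        # true share of crimes that are gruesome
RECALL_GRUESOME = 0.90   # chance a gruesome crime is remembered
RECALL_MUNDANE = 0.05    # chance a mundane crime is remembered

recalled_gruesome = recalled_mundane = 0
for _ in range(N_CRIMES):
    gruesome = random.random() < P_GRUESOME
    remembered = random.random() < (RECALL_GRUESOME if gruesome else RECALL_MUNDANE)
    if remembered:
        if gruesome:
            recalled_gruesome += 1
        else:
            recalled_mundane += 1

share_recalled = recalled_gruesome / (recalled_gruesome + recalled_mundane)
print(f"True share of gruesome crimes: {P_GRUESOME:.0%}")      # 10%
print(f"Share among recalled crimes:   {share_recalled:.0%}")  # roughly two-thirds
```

Judged from memory alone, gruesome crimes look several times more common than they really are; that’s the availability error in miniature.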

Available memories don’t have to be recent. In fact, vivid memories can last for years and affect our judgment and behavior in subtle ways. Indeed, I still go easy on tequila because of vivid memories from college days.

Satisficing, temporizing, and availability are three rules of thumb that help us get through the day. They’re part of System 1, which we can’t turn off, so we’re always vulnerable to these types of errors. In general, the benefits of System 1 outweigh the costs, but you should be aware of the costs. If the costs are getting out of hand, it’s time to switch on System 2.

I drew primarily on two sources for this article: Peter Facione’s Think Critically and Daniel Kahneman’s Thinking, Fast and Slow.

Thinking: System 1 and System 2

Think Fast.

Do you know how you think? It’s both simpler and more complicated than you might imagine.

It turns out that we have not one but two thinking systems. One is fast, automatic, and doesn’t require much energy. The other is slow, requires a lot of energy, and activates only when needed. Both systems are naturally good at grammar. Neither system is good at statistics.

Why do we need two systems? Because much of our life is filled with routine. Rather than using our slow, energy-hungry system to deal with routine matters, we’ve developed a fast, energy-efficient system to handle daily activities. That’s why we can drive 50 miles and not remember any of it. That’s why we can enjoy a walk in the park while our mind is in some other universe — being creative, no doubt.

Notice, however, that what’s routine to one person is exotic and complicated to another. What’s routine to an airline pilot would be complicated, confusing, and downright scary to me. We train our fast, energy-efficient system with our experiences. As we gain experience, more things become routine. We can do more things on autopilot, while turning over the creative thinking to our energy-hungry system. That may be why Earl Weaver, former manager of the Baltimore Orioles, titled his autobiography It’s What You Learn After You Know It All That Counts.

Psychologists have named our two thinking systems — rather prosaically — System 1 and System 2. System 1 is fast and always on. You can’t turn it off. System 2 engages at various times — especially when System 1 encounters something out of the ordinary.

System 1 knows the “default” value — if everything is routine, just select the default action. To select something other than the default value, you typically need to fire up System 2. In Thinking, Fast and Slow, Daniel Kahneman tells a story about parole boards in Israel. For parole judges, the default value is to deny parole. Judges have to find something positive and think through their position to approve parole. As a result, you’re much more likely to be approved for parole if your case is heard right after lunch. Immediately after eating, the judges have a lot of fuel for their brains and find it much easier to activate System 2. Thus, it’s easier to override the default position.

While System 1 is critical to our daily living, it’s also prone to specific types of errors. Indeed, psychologists have cataloged 17 different classes of System 1 errors. As we probe more deeply into critical thinking, I’ll provide an overview of all 17 and will take a closer look at a few of the more common issues. Each time I review the list, I can recall a whole host of errors that I’ve made. Frankly, I’ll probably continue to make similar errors in the future. By understanding the types of errors I might make, however, I can check myself and maybe activate System 2 more frequently. As you read through the 17 types of System 1 errors, think about your own experiences. If you have good examples, please share them.

I drew primarily on two sources for this article: Daniel Kahneman’s Thinking, Fast and Slow and Peter Facione’s Think Critically.

 

Do Generals Stray More Than Teachers?

Do generals commit adultery more often than, say, elementary school teachers?

The way we answer this question says a lot about the way we think. If you’ve been reading about American generals recently, you know that a lot of top-ranking officers have been caught with their hands in the cookie jar. The facts are easily available to you. You can recall them quickly. Indeed, they’re very likely top of mind. (One of my students asked, in mock horror, since when have generals taken orders from their privates?)

On the other hand, when was the last time you read about cheating primary school teachers? It’s probably been a long time, if ever. Why? Because stories about cheating teachers don’t sell many newspapers. Stories about cheating generals seize our attention and hold it. It’s a great way to sell newspapers, magazines, and TV shows.

So, it’s easy for you to remember stories about cheating generals. It’s much harder to remember stories about cheating teachers. Based on your ability to remember relevant cases, you might conclude that generals do indeed stray more often than teachers. Would you be right? Maybe … but maybe not. All you’ve really done is search your own memory banks. As we all know, memory is fallible and can easily play tricks on us.

When we’re asked a comparative question like generals versus teachers, we often try to answer a different question: how many cases of each can I readily recall? It’s an easier question to answer and doesn’t require us to search external sources and think hard thoughts. Though it’s easy, it’s often erroneous.
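
To make the substitution concrete, here’s a hypothetical sketch (every figure in it is invented): System 1 compares the story counts we can recall, while the question actually asked is about rates, which needs data we don’t carry in our heads.

```python
# Hypothetical illustration of question substitution; all figures are invented.
# Question asked:     which group strays at the higher *rate*?
# Question answered:  for which group can I recall more *stories*?

recalled_stories = {"generals": 6, "teachers": 1}  # what the headlines left in memory

# System 1's substituted answer: compare whatever comes to mind.
easy_answer = max(recalled_stories, key=recalled_stories.get)
print(f"Recall-based answer: {easy_answer}")  # -> generals

# The question actually asked needs external data: incidents and group sizes.
incidents  = {"generals": 3, "teachers": 30_000}       # invented counts
group_size = {"generals": 900, "teachers": 1_500_000}  # invented populations
for group in incidents:
    print(f"{group}: rate = {incidents[group] / group_size[group]:.2%}")
# With these made-up numbers, the rates are 0.33% and 2.00%: the easy,
# recall-based answer and the rate-based answer need not agree at all.
```

The point isn’t the numbers (they’re invented); it’s that the two computations answer different questions, and only one of them is the question that was asked.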

I think I saw this phenomenon in action during the recent presidential election. My friends who supported Obama tended to talk to other people who supported Obama. If you asked how many people would support Obama, they could readily retrieve many cases and conclude that Obama would win. Of course, my friends who supported Romney were doing exactly the same thing — talking with or listening to other Romney supporters. I heard one person say, “Of course Romney will win. Everybody hates Obama.” I suspect that everybody he talked to hated Obama. But that’s not the same as everybody.

Relying on easily available information can help create the political chasms that we see around us. If you read a lot of articles about intransigent Republicans, you may conclude that Republicans are more intransigent than Democrats. That may be true … or it could just be a product of what you remember. Similarly, if you read lots of articles about Democrats undercutting the military, you might come to believe … well, you get the picture.

What should we do? First, remember that the easy answer is often the wrong answer. It depends on what we remember rather than what’s actually happening. Second, start reading more sources that “disagree” with your point of view. All information sources have some degree of bias. Reading widely can help you establish a balance. Third, study up on statistics. It will help you understand what’s accurate and what’s not.

By the way, this post is adapted from Thinking, Fast and Slow by Daniel Kahneman, easily the best book I’ve read this year.

(Note: I’ll teach a class on Applied Critical Thinking during the winter term at the University of Denver. Some of my teaching material will show up here in posts about how we think. They’ll all carry the tag Applied Critical Thinking, so you can find them easily.)

 
