
Thinking: System 1 and System 2

Think Fast.

Do you know how you think? It’s both simpler and more complicated than you might imagine.

It turns out that we have not one but two thinking systems. One is fast, automatic, and doesn’t require much energy. The other is slow, requires a lot of energy, and activates only when needed. Both systems are naturally good at grammar. Neither system is good at statistics.

Why do we need two systems? Because much of our life is filled with routine. Rather than using our slow, energy-hungry system to deal with routine matters, we’ve developed a fast, energy-efficient system to handle daily activities. That’s why we can drive 50 miles and not remember any of it. That’s why we can enjoy a walk in the park while our mind is in some other universe — being creative no doubt.

Notice, however, that what’s routine to one person is exotic and complicated to another. What’s routine to an airline pilot would be complicated, confusing, and downright scary to me. We train our fast, energy-efficient system with our experiences. As we gain experience, more things become routine. We can do more things on autopilot, while turning over the creative thinking to our energy-hungry system. That may be why Earl Weaver, former manager of the Baltimore Orioles, titled his autobiography It’s What You Learn After You Know It All That Counts.

Psychologists have named our two thinking systems — rather prosaically — System 1 and System 2. System 1 is fast and always on. You can’t turn it off. System 2 engages at various times — especially when System 1 encounters something out of the ordinary.

System 1 knows the “default” value — if everything is routine, just select the default action. To select something other than the default value, you typically need to fire up System 2. In Thinking, Fast and Slow, Daniel Kahneman tells a story about parole boards in Israel. For parole judges, the default value is to deny parole. Judges have to find something positive and think through their position to approve parole. As a result, you’re much more likely to be approved for parole if your case is heard right after lunch. Immediately after eating, the judges have a lot of fuel for their brains and find it much easier to activate System 2. Thus, it’s easier to override the default position.

While System 1 is critical to our daily living, it’s also prone to specific types of errors. Indeed, psychologists have cataloged 17 different classes of System 1 errors.  As we probe more deeply into critical thinking, I’ll provide an overview of all 17 and will delve more deeply into a few of the more common issues. Each time I review the list, I can recall a whole host of errors that I’ve made. Frankly, I’ll probably continue to make similar errors in the future. By understanding the types of errors I might make, however, I can check myself and maybe activate System 2 more frequently. As you read through the 17 types of System 1 errors, think about your own experiences. If you have good examples, please share them.

I drew primarily on two sources for this article: Daniel Kahneman’s Thinking, Fast and Slow and Peter Facione’s Think Critically.

 

As we become more energy efficient, do we use less energy?

We buy my wife, Suellen, a new car every ten years whether she needs one or not. It’s always a red Volkswagen convertible with a manual transmission. (She’s old school). We bought the first one in 1985, another one in 1995, and the current one in 2005. She just looks cute in a red convertible.

What struck me about these three cars was how power was deployed. The 1985 model had roll-up windows and a manually operated roof. The ’95 had power windows and a manual roof. The ’05 has both power windows and a power roof. The 2005 model is clearly more energy-efficient than the ’85 model, but does it use less energy?

I didn’t know it at the time, but I had stumbled onto something called Jevons paradox (closely related to what economists call the rebound effect). Named after William Stanley Jevons, a British economist writing in the mid-19th century, the paradox states that increasing energy efficiency leads to greater energy use. It’s basically a cost curve. Increased energy efficiency means each unit of energy costs less. As costs decline, we buy more energy — just as we do with most commodities.

The result: instead of raising and lowering the convertible’s roof with arm power, we put in a little motor to do it for us. According to Jevons, the net effect is that we use more energy rather than less.
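
To make that cost-curve logic concrete, here is a toy sketch in Python of the rebound effect. Everything in it (the baseline demand, the energy price, and especially the demand elasticity) is an invented number for illustration, not Jevons’ own model; it simply shows that when demand is elastic enough, the extra consumption swamps the efficiency gain.

    # Toy model of the rebound effect: a constant-elasticity demand curve
    # for an energy service (say, raising a convertible roof or driving miles).
    # All numbers are made up for illustration.

    def energy_use(efficiency, price_per_unit_energy=1.0,
                   baseline_service=100.0, elasticity=1.2):
        """Total energy consumed to deliver a service at a given efficiency."""
        cost_per_service = price_per_unit_energy / efficiency
        baseline_cost = price_per_unit_energy / 1.0      # efficiency = 1 baseline
        # Cheaper service -> we consume more of it (constant-elasticity demand).
        service_demanded = baseline_service * (cost_per_service / baseline_cost) ** (-elasticity)
        return service_demanded / efficiency             # energy = service / efficiency

    for eff in (1.0, 1.5, 2.0):
        print(f"efficiency {eff:.1f} -> energy use {energy_use(eff):.1f}")
    # With elasticity 1.2, doubling efficiency raises energy use from 100 to about 115.

If you set the elasticity below 1.0, the rebound only claws back part of the savings; Jevons’ claim, roughly translated into this framing, is that for coal in Victorian Britain the effective elasticity was greater than one.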

I’ll admit that I’m a tree hugger and that I’m concerned about global warming. I also study the processes of innovation, and I generally applaud innovations that result in greater energy efficiency. But the more I ponder Jevons paradox, the more I wonder whether we’ve aimed our innovations at the wrong target. Shouldn’t we be creating innovations that help us use less energy rather than more?

(If you want to read more about Jevons paradox, The New Yorker has a terrific article on it. On the other hand, Think Progress argues that the paradox exists in theory but not in practice. Both make for interesting reading.)

 

The Art of the Wrong View

In one of my classes at the University of Denver, I try to teach my students how to manage technologies that constantly morph and change. They’re unpredictable, they’re slippery, and managing them effectively can make the difference between success and failure.

The students, of course, want to predict the future so they can prepare for it.  I try to convince them that predicting the future is impossible. But they’re young. They can explain the past, so why can’t they predict the future?

To help them prepare for the future — though not predict it — I often teach the techniques of scenario planning. You tell structured stories about the future and then work through them logically to understand which way the world might tilt. The technique has common building blocks, often referred to as PESTLE: your stories need to incorporate political, economic, social, technological, legal, and environmental factors. This helps ensure that you don’t overlook anything.
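
For what it’s worth, here is a small, hypothetical Python sketch that treats PESTLE as a six-dimension checklist and enumerates candidate scenario skeletons. The driving forces listed are invented examples, not anything from Schwartz’s book; in practice you would keep only a handful of internally consistent combinations and develop those into full stories.

    # PESTLE as a scenario-planning checklist. The six dimensions come from the
    # article; the example driving forces are invented placeholders.
    from itertools import product

    PESTLE = {
        "Political":     ["stable regulation", "rising trade restrictions"],
        "Economic":      ["steady growth", "prolonged recession"],
        "Social":        ["aging workforce", "remote-first culture"],
        "Technological": ["incremental change", "disruptive breakthrough"],
        "Legal":         ["status quo", "strict data-privacy rules"],
        "Environmental": ["business as usual", "aggressive carbon pricing"],
    }

    # Every combination of one force per dimension is a candidate scenario skeleton.
    scenarios = list(product(*PESTLE.values()))
    print(f"{len(scenarios)} raw combinations; pick three or four coherent stories.")
    print("Example skeleton:", dict(zip(PESTLE, scenarios[0])))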

I’ve used scenario planning a number of times and it has always helped me think through situations in creative ways, so it seems reasonable to teach it. To prepare for a recent class, I re-read The Art of the Long View by Peter Schwartz. I found it on one of my dustier bookshelves and discovered it was the 1991 edition. While I remembered many of the main points, I was surprised to find a long chapter titled “The World in 2005: Three Scenarios”. Here was a chance to see how well one of the pioneers of scenario planning could prepare us for the future.

In sum, I was quite disappointed. The main error was that each scenario vastly overestimated the importance of Japan on the world stage in 2005. In a way, it all makes sense. The author was writing in 1991, when we all believed that Japan might just surpass every other economy on earth. Of course, he would assume that Japan would still dominate in 2005. Of course, he was wrong.

So what can we learn from this?  Two things, I think:

  1. Always remember to ask the reverse question. If it’s “obvious” that a trend will continue (e.g., Japan will dominate), always remember to ask the non-obvious question: what if it doesn’t? Today, it seems obvious that health care costs will continue to rise for the foreseeable future. But what if we make a medical breakthrough and costs plummet?
  2. Remember that resilience is better than prediction. We’ll never be able to predict the future — partially because we can’t really explain the past. But we can be prepared by building flexible systems that can respond to unexpected jolts. The human immune system is probably a good model. This means building systems and organizations where information flows easily, creativity is valued, and leaders can emerge from anywhere.

I’ll continue to teach scenario planning in the future. After all, it’s a good template for thinking and planning. I’ll also be able to provide a very good example of how it can all go wrong.

Sources of Argumentation – Beating Writer’s Block

What should I say?

While preparing for a public speaking engagement, have you ever had writer’s block? Can’t figure out the logic that’s most likely to move your audience? Or maybe it’s the opposite: there are so many possible arguments that it’s difficult to choose among them? According to the Greeks, there are seven basic sources of argumentation. In order of persuasiveness, they are security, health, utility, the five senses, community, emotion, and authority. Learn more about each in this week’s Persuasive Communication Tip of the Week.
