Strategy. Innovation. Brand.

Critical Thinking

More Thinking on Your Thumbs

Power differential.

Remember heuristics? They’re the rules of thumb that allow us to make snap judgments, using System 1, our fast, automatic, always-on thinking system. They can also lead us into errors. According to psychologists, there are at least 17 errors that we commonly make. In previous articles, I’ve written about seven of them (click here and here). Let’s look at four more today.

Association — word association games are a lot of fun. (Actually, words are a lot of fun.) But making associations and then drawing conclusions from them can get you into trouble. You say tomato and I think of the time I ate a tomato salad and got sick. I’m not going to do that again. That’s neither good hygiene nor good logic. The upside is that word associations can lead you to some creative thinking. You can make connections that you might otherwise have missed. And, as we all know, connections are the foundation of innovation. Just be careful about drawing conclusions.

Power differential — did you ever work for a boss with a powerful personality? Then you know something about this heuristic. Socially and politically, it may be easier to accept an argument made by a “superior authority” than it is to oppose it. It’s natural. We tend to defer to those who have more power or prestige than we do. Indeed, there’s an upside here as well. It’s called group harmony. Sometimes you do need to accept your spouse’s preferences even if they differ from yours. The trick is to recognize when a preference is merely a matter of taste and when it can have significant negative consequences. As Thomas Jefferson said, “On matters of style, swim with the current. On matters of principle, stand like a rock.”

Illusion of control — how much control do you really have over processes and people at your office? It’s probably a lot less than you think. I’ve worked with executives who think they’ve solved a problem just because they’ve given one good speech. A good speech can help, but it’s usually just one step in a long chain of activities. Here’s a tip for spotting other people who have an illusion of control: they say “I” much more often than “we.” It’s poor communication and one of the worst mistakes you can make in a job interview. (Click here for more).

Loss and risk aversion — let’s just keep doing what we’re doing. Let’s not change things … we might be worse off. Why take risks? It turns out that loss aversion has a much bigger influence on economic decisions than we once thought. In Thinking, Fast and Slow, Daniel Kahneman writes about our unbalanced logic when weighing gain against loss — we fear loss more than we’re attracted by gain. In general terms, the pain of a loss is about double the pleasure of a gain. So, emotionally, it takes a $200 gain to balance a $100 loss. Making 2-to-1 decisions may be good for your nerves, but it often means you’ll pass up good economic opportunities.
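To see the 2-to-1 arithmetic in miniature, here’s a minimal sketch in Python. It assumes a simple linear value function with a loss-aversion coefficient of 2.0, a deliberate simplification of the prospect-theory ideas Kahneman describes, not the full model.

```python
# A rough sketch of loss aversion: losses weigh about twice as much
# as equivalent gains. LAMBDA = 2.0 reflects the 2-to-1 ratio
# described above; empirical estimates vary from person to person.

LAMBDA = 2.0  # loss-aversion coefficient (assumed for illustration)

def felt_value(outcome_dollars: float) -> float:
    """Emotional value of a gain (+) or loss (-), in 'gain units'."""
    if outcome_dollars >= 0:
        return outcome_dollars
    return LAMBDA * outcome_dollars  # losses hurt roughly twice as much

# A coin flip that wins $150 or loses $100 is a good deal on paper
# (expected value: +$25), but emotionally it nets out negative:
flip = 0.5 * felt_value(150) + 0.5 * felt_value(-100)
print(flip)  # -25.0: feels like a loser, so most people decline
```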

To prepare this article, I drew primarily on Peter Facione’s Think Critically. (Click here.) Daniel Kahneman’s Thinking, Fast and Slow is here.

Christmas – Deadliest Day of the Year

Drop by any time.

Here’s a little exercise in critical thinking. More people in the United States die on Christmas Day than on any other day of the year. It’s our deadliest day. New Year’s Day is the second deadliest day.

The first question a critical thinker would ask is: Is that really true? Here’s the evidence: an article published in Social Science & Medicine in October 2010 that analyzed 54 million death certificates from 1979 through 2004. (Click here for the abstract and charts; the full text is behind a paywall.) Regardless of the setting or the cause of death, the number of deaths clearly peaks on Christmas Day. The effect holds for every demographic group except children.

The next question a critical thinker might ask is: Why would that be? Here the logic gets a little fuzzy. As the authors of the research paper point out, they tested nine different hypotheses but believe more research is necessary. So let’s think about it a bit.

  • Hypothesis 1: Perhaps it’s because people overeat on Christmas Day, overloading the digestive system, causing systemic stress and death. Really? One big meal causes death? If that’s the case, many of us would be long gone already.
  • Hypothesis 2: It’s the stress of having all those family members and in-laws around. True, that’s a lot of stress but a lot of other holidays cause stress as well. If that’s the case, why wouldn’t we also see spikes on Thanksgiving or July 4th?
  • Hypothesis 3: Maybe sick people hang on until Christmas and then let go. It’s possible — people can and do keep themselves alive until a big event. But that doesn’t explain why mortality rises in the days and weeks before Christmas. If people were hanging on, you would expect a dip in deaths just before the holiday, not a rise.

The hypothesis I like — which I spotted in the Daily Beast (click here) — is that Christmas isn’t abnormal in terms of life-threatening incidents, but it is abnormal in the way people behave when a life-threatening incident occurs. If you feel chest pains on any random day, you may head straight for the hospital. That’s a good idea, because the sooner you get there, the better your chances of survival. On Christmas, however, people may delay, not wanting to spoil the festive atmosphere or leave the family celebration. They may also believe that they’ll get poor service at the hospital on Christmas — it will likely be understaffed or staffed by second-stringers. Better to wait ’til tomorrow to get better service.

The next question a critical thinker might ask is: If this is true, what should we do about it, if anything? The hypothesis, of course, is not fully tested. We can’t claim conclusively that it’s true. But there is a certain logic to it. Perhaps enough that we can make Pascal’s wager — the evidence isn’t conclusive, but it’s strong enough to bet on. If we’re wrong, we don’t lose much. If we’re right, we can save a lot of lives. So, what do we do? Perhaps we can publicize the phenomenon and encourage people to get to the hospital quickly, even if it is Christmas. In fact, consider this article a public service announcement. If you have chest pains today, get your butt to the hospital pronto!
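Pascal’s wager, by the way, is just an expected-value calculation. Here’s a toy version of the arithmetic; every number in it is an invented assumption for the sake of the exercise, not data from the study.

```python
# Toy expected-value arithmetic behind the wager. Every number is an
# invented assumption for illustration, not data from the study.

p_true = 0.5        # chance the delay hypothesis is right: call it a coin flip
lives_saved = 1000  # hypothetical lives saved per year if the PSA works
downside = 5        # hypothetical cost of the campaign, in life-equivalents

expected_gain = p_true * lives_saved     # 500.0
expected_loss = (1 - p_true) * downside  # 2.5

# The bet pays off even under pessimistic assumptions: the downside
# is tiny, so almost any chance of being right makes it worth taking.
print(expected_gain > expected_loss)  # True
```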

Merry Christmas!

Thinking With Your Thumbs – Part 1

Do you bite your thumb at me, sir?

Can we think with our thumbs? Well, metaphorically we do. When we use System 1 — our fast, automatic, energy-efficient thinking system — we use heuristics, shortcuts to get to an answer that is “good enough”. We often refer to heuristics as rules of thumb — rough and ready ways to deal with reality. (For a comparison of System 1 versus System 2, click here).

Our rules of thumb work most of the time but not all of the time. Psychologists have classified 17 different errors that we make when we use System 1. Let’s look at three today.

Satisficing and temporizing are two errors that often go hand in hand. Satisficing simply means that when we find a choice that’s good enough, we take it and don’t search any further. (The definition of “good enough” is entirely up to you.) Defense lawyers regularly accuse the police of satisficing. The accusation goes something like this: “You found my client and decided that he committed the crime. You stopped looking for any other suspects. You let the real criminal get away.”
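Satisficing also has a precise algorithmic flavor: accept the first option that clears your “good enough” bar instead of scanning every option for the best one. Here’s a minimal sketch; the options, scores, and threshold are all invented for illustration.

```python
# Satisficing vs. maximizing, in miniature. The options, their scores,
# and the threshold are invented for illustration.

options = [("A", 6), ("B", 8), ("C", 9), ("D", 7)]
GOOD_ENOUGH = 7  # remember: the definition of "good enough" is up to you

def satisfice(opts, threshold):
    """Take the first option that clears the bar, then stop searching."""
    for name, score in opts:
        if score >= threshold:
            return name
    return None  # nothing was good enough

def maximize(opts):
    """Scan everything and take the best (thorough, but expensive)."""
    return max(opts, key=lambda o: o[1])[0]

print(satisfice(options, GOOD_ENOUGH))  # B: good enough, found early
print(maximize(options))                # C: better, but cost a full scan
```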

Temporizing is similar to satisficing but adds a time dimension. You’re temporizing when you choose an option that’s good enough for now. How much education do you need? Well, let’s say that you can get a good job immediately with only a Bachelor’s degree. It’s good enough for now. But, 20 years from now you may not be able to get the promotion you want because you don’t have a Master’s degree. You may regret that you temporized in your younger years.

If you ever hear someone say, “if it ain’t broke, don’t fix it” you may well conclude that they’re either satisficing or temporizing. Whatever “it” is, it’s good enough for now.

Availability is another error category that we encounter often. When we’re asked a difficult question, we often search our memory banks for cases that would help us develop an answer. If we can recall cases easily, we tend to overestimate the probability that the same phenomenon will occur again. In other words, if the cases are readily available (to our memory), we tend to exaggerate their probability. This is especially true with vivid memories. This is one reason that people tend to overestimate the crime rate in their communities. Recent crimes are readily recalled — you read about them in the papers every day. Gruesome crimes create vivid memories — thus, many people think that gruesome crimes occur far more frequently than they do.
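A toy simulation shows how strongly this mechanism can inflate an estimate. Here I assume gruesome crimes are 5% of the total but ten times easier to recall; both numbers are invented for illustration.

```python
import random

# Toy model of the availability error: we estimate frequency by
# sampling from memory, but vivid events are easier to recall.
# The 5% base rate and the 10x recall weight are both invented.

random.seed(42)
TRUE_GRUESOME_RATE = 0.05
RECALL_WEIGHT = {"gruesome": 10.0, "mundane": 1.0}

crimes = ["gruesome" if random.random() < TRUE_GRUESOME_RATE else "mundane"
          for _ in range(100_000)]

# Recall probability is proportional to vividness:
weights = [RECALL_WEIGHT[c] for c in crimes]
recalled = random.choices(crimes, weights=weights, k=10_000)

felt_rate = recalled.count("gruesome") / len(recalled)
print(f"true rate: {TRUE_GRUESOME_RATE:.0%}, felt rate: {felt_rate:.0%}")
# Prints roughly "true rate: 5%, felt rate: 34%": vividness makes the
# rare event feel about seven times more common than it really is.
```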

Available memories don’t have to be recent. In fact, vivid memories can last for years and affect our judgment and behavior in subtle ways. Indeed, I still go easy on tequila because of vivid memories from college days.

Satisficing, temporizing, and availability are three rules of thumb that help us get through the day. They’re part of System 1, which we can’t turn off, so we’re always vulnerable to these types of errors. In general, the benefits of System 1 outweigh the costs, but you should be aware of the costs. If the costs are getting out of hand, it’s time to switch on System 2.

I drew primarily on two sources for composing this article: Peter Facione’s Think Critically (click here) and Daniel Kahneman’s Thinking, Fast and Slow (click here).

Thinking: System 1 and System 2

Think Fast.

Do you know how you think? It’s both simpler and more complicated than you might imagine.

It turns out that we have not one but two thinking systems. One is fast, automatic, and doesn’t require much energy. The other is slow, requires a lot of energy, and activates only when needed. Both systems are naturally good at grammar. Neither system is good at statistics.

Why do we need two systems? Because much of our life is filled with routine. Rather than using our slow, energy-hungry system to deal with routine matters, we’ve developed a fast, energy-efficient system to handle daily activities. That’s why we can drive 50 miles and not remember any of it. That’s why we can enjoy a walk in the park while our mind is in some other universe — being creative no doubt.

Notice, however, that what’s routine to one person is exotic and complicated to another. What’s routine to an airline pilot would be complicated, confusing, and downright scary to me. We train our fast, energy-efficient system with our experiences. As we gain experience, more things become routine. We can do more things on autopilot, while turning over the creative thinking to our energy-hungry system. That may be why Earl Weaver, former manager of the Baltimore Orioles, titled his autobiography It’s What You Learn After You Know It All That Counts.

Psychologists have named our two thinking systems — rather prosaically — System 1 and System 2. System 1 is fast and always on. You can’t turn it off. System 2 engages at various times — especially when System 1 encounters something out of the ordinary.

System 1 knows the “default” value — if everything is routine, just select the default action. To select something other than the default, you typically need to fire up System 2. In Thinking, Fast and Slow, Daniel Kahneman tells a story about parole boards in Israel. For parole judges, the default is to deny parole. Judges have to find something positive and think through their position to approve parole. As a result, you’re much more likely to be approved for parole if your case is heard right after lunch. Immediately after eating, the judges have plenty of fuel for their brains and find it much easier to activate System 2. Thus, it’s easier to override the default position.

While System 1 is critical to our daily living, it’s also prone to specific types of errors. Indeed, psychologists have cataloged 17 different classes of System 1 errors. As we probe more deeply into critical thinking, I’ll provide an overview of all 17 and will delve more deeply into a few of the more common issues. Each time I review the list, I can recall a whole host of errors that I’ve made. Frankly, I’ll probably continue to make similar errors in the future. By understanding the types of errors I might make, however, I can check myself and maybe activate System 2 more frequently. As you read through the 17 types of System 1 errors, think about your own experiences. If you have good examples, please share them.

I drew primarily on two sources for composing this article: Daniel Kahneman’s Thinking, Fast and Slow (click here) and Peter Facione’s Think Critically (click here).

 

As we become more energy efficient, do we use less energy?

We buy my wife, Suellen, a new car every ten years whether she needs one or not. It’s always a red Volkswagen convertible with a manual transmission. (She’s old school). We bought the first one in 1985, another one in 1995, and the current one in 2005. She just looks cute in a red convertible.

What struck me about these three cars was how power was deployed. The 1985 model had roll-up windows and a manually operated roof. The ’95 had power windows and a manual roof. The ’05 has both power windows and a power roof. The 2005 model is clearly more energy efficient than the ’85 model, but does it use less energy?

I didn’t know it at the time, but I had stumbled onto something called the Jevons paradox (also known as the rebound effect). Named after William Stanley Jevons, a British economist of the mid-19th century, the paradox states that increasing energy efficiency leads to greater energy use. It’s basically a cost curve: increased energy efficiency means each unit of energy costs less, and as costs decline, we buy more energy — just as we do with most commodities.

The result: instead of raising and lowering the convertible’s roof with arm power, we put in a little motor to do it for us. According to Jevons, the net effect is that we use more energy rather than less.
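The arithmetic behind that claim is worth a look. Whether an efficiency gain backfires depends on how strongly demand responds to the lower effective price. Here’s a back-of-the-envelope sketch; the constant-elasticity demand curve is a textbook simplification, and the efficiency and elasticity values are assumptions chosen for illustration.

```python
# Back-of-the-envelope rebound arithmetic with a constant-elasticity
# demand curve. The efficiency gain and the elasticity are assumed
# values, chosen for illustration.

EFFICIENCY_GAIN = 2.0  # new motor delivers 2x the service per unit of energy
ELASTICITY = 1.2       # price elasticity of demand for the service (assumed)

# Doubling efficiency halves the effective price of each unit of
# "energy service," and demand for the service grows in response.
price_ratio = 1 / EFFICIENCY_GAIN                 # 0.5
service_demand = (1 / price_ratio) ** ELASTICITY  # 2**1.2, about 2.30

# Total energy used = service demanded / efficiency.
energy_ratio = service_demand / EFFICIENCY_GAIN   # about 1.15
print(f"energy use changes by a factor of {energy_ratio:.2f}")
# Above 1.0, Jevons wins: with elasticity > 1, total use rises. With
# elasticity < 1, efficiency saves energy on net, which is roughly
# the Think Progress position cited below.
```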

I’ll admit that I’m a tree hugger and that I’m concerned about global warming. I also study the processes of innovation, and I generally applaud innovations that result in greater energy efficiency. But the more I ponder the Jevons paradox, the more I wonder whether we’ve aimed our innovations at the wrong target. Shouldn’t we be creating innovations that help us use less energy rather than more?

(If you want to read more about the Jevons paradox, The New Yorker has a terrific article here. On the other hand, Think Progress says the paradox exists in theory but not in practice. That article is here. Interesting reading.)

 
