
Critical Thinking

Christmas – Deadliest Day of the Year

Drop by any time.

Here’s a little exercise in critical thinking. More people in the United States die on Christmas Day than on any other day of the year. It’s our deadliest day. New Year’s Day is the second deadliest day.

The first question a critical thinker would ask is: Is that really true? Here’s the evidence: an article published in Social Science and Medicine in October 2010 that analyzed 54 million death certificates from 1979 through 2004. (Click here for the abstract and charts; the full text is behind a paywall.) Regardless of the setting or the cause, the number of deaths clearly peaks on Christmas Day. The pattern holds for all demographic groups except children.

The next question a critical thinker might ask is: Why would that be? Here the logic gets a little fuzzy. As the authors of the research paper point out, they tested nine different hypotheses but believe more research is necessary. So let’s think about it a bit.

  • Hypothesis 1: Perhaps it’s because people overeat on Christmas Day, overloading the digestive system and causing systemic stress and death. Really? One big meal causes death? If that were the case, many of us would be long gone already.
  • Hypothesis 2: It’s the stress of having all those family members and in-laws around. True, that’s a lot of stress, but plenty of other holidays are stressful too. If stress were the culprit, why wouldn’t we also see spikes on Thanksgiving or July 4th?
  • Hypothesis 3: Maybe sick people hang on until Christmas and then let go. It’s possible — people can and do keep themselves alive until a big event. But that doesn’t explain why mortality rises in the days and weeks before Christmas. If people were hanging on, you would expect to see a dip in deaths just before Christmas.

The hypothesis I like — which I spotted in the Daily Beast (click here) — is that Christmas isn’t abnormal in the number of life-threatening incidents; it’s abnormal in the way people behave when a life-threatening incident occurs. If you feel chest pains on any random day, you may just head straight for the hospital. That’s a good idea, because the sooner you get there, the better your chances of survival. On Christmas, however, people may delay, not wanting to spoil the festive atmosphere or leave the family on a day of celebration. They may also believe that they’ll get poor service at the hospital on Christmas: it will likely be understaffed or staffed by second-stringers, so better to wait until tomorrow.

The next question a critical thinker might ask is: If this is true, what should we do about it, if anything? This hypothesis, of course, is not fully tested. We can’t claim conclusively that it’s true. But there is a certain logic to it. Perhaps enough that we can make Pascal’s wager — the evidence isn’t conclusive, but it’s strong enough to justify a bet. If we’re wrong, we don’t lose much. If we’re right, we can save a lot of lives. So, what do we do? Perhaps we can publicize the phenomenon and encourage people to get to the hospital quickly, even if it is Christmas. In fact, consider this article a public service announcement. If you have chest pains today, get your butt to the hospital pronto!
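
To see why the wager can make sense even without conclusive evidence, here’s a minimal expected-value sketch in Python. Every number in it is a hypothetical placeholder I made up for illustration, not a figure from the research.

```python
# A toy expected-value version of the Pascal's-wager argument.
# Every number below is a hypothetical placeholder, not data from the study.

p_hypothesis_true = 0.5    # assumed chance the "people delay treatment" hypothesis is right
campaign_cost = 1.0        # cost of a public awareness campaign (arbitrary units)
benefit_if_right = 100.0   # value of lives saved if the hypothesis is right (arbitrary units)

# We pay the cost either way, but only gain the benefit if the hypothesis is true.
ev_campaign = p_hypothesis_true * benefit_if_right - campaign_cost
ev_do_nothing = 0.0

print(f"Expected value of the campaign:  {ev_campaign:.1f}")
print(f"Expected value of doing nothing: {ev_do_nothing:.1f}")
# As long as the potential benefit dwarfs the cost, the bet pays off
# even if the hypothesis is far from certain.
```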

Merry Christmas!

Thinking With Your Thumbs – Part 1

Do you bite your thumb at me, sir?

Can we think with our thumbs? Well, metaphorically we do. When we use System 1 — our fast, automatic, energy-efficient thinking system — we use heuristics, shortcuts to get to an answer that is “good enough”. We often refer to heuristics as rules of thumb — rough and ready ways to deal with reality. (For a comparison of System 1 versus System 2, click here).

Our rules of thumb work most of the time but not all of the time. Psychologists have classified 17 different errors that we make when we use System 1. Let’s look at three today.

Satisficing and temporizing are two errors that often go hand in hand. Satisficing simply means that when we find a choice that’s good enough, we take it and don’t search any further. (The definition of “good enough” is entirely up to you.) Defense lawyers regularly accuse the police of satisficing. The accusation goes something like this: “You found my client and decided that he committed the crime. You stopped looking for any other suspects. You let the real criminal get away.”

Temporizing is similar to satisficing but adds a time dimension. You’re temporizing when you choose an option that’s good enough for now. How much education do you need? Well, let’s say you can get a good job immediately with only a Bachelor’s degree. It’s good enough for now. But 20 years from now, you may not be able to get the promotion you want because you don’t have a Master’s degree. You may regret that you temporized in your younger years.

If you ever hear someone say, “If it ain’t broke, don’t fix it,” you may well conclude that they’re either satisficing or temporizing. Whatever “it” is, it’s good enough for now.
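
Because satisficing is essentially a stopping rule (take the first option that clears your threshold and quit looking), a tiny sketch can make the contrast with exhaustive searching concrete. This is a made-up Python toy, not anything from Facione or Kahneman; the scores and the threshold are invented.

```python
# Toy illustration of satisficing versus optimizing.
# The option scores and the "good enough" threshold are invented values.

options = [62, 71, 85, 90, 78]   # quality of each choice, in the order we encounter them
GOOD_ENOUGH = 80                 # our personal definition of "good enough"

def satisfice(scores, threshold):
    """Take the first option that clears the threshold and stop searching."""
    for score in scores:
        if score >= threshold:
            return score
    return None  # nothing was good enough

def optimize(scores):
    """Examine every option and take the best one."""
    return max(scores)

print("Satisficer picks:", satisfice(options, GOOD_ENOUGH))  # 85 -- stops at the third option
print("Optimizer picks: ", optimize(options))                # 90 -- but had to look at everything
```

Satisficing saves time and effort; the cost is that a better option may go unexamined, which is exactly the defense lawyer’s complaint.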

Availability is another error category that we encounter often. When we’re asked a difficult question, we often search our memory banks for cases that would help us develop an answer. If we can recall cases easily, we tend to overestimate the probability that the same phenomenon will occur again. In other words, if the cases are readily available (to our memory), we tend to exaggerate their probability. This is especially true with vivid memories. This is one reason that people tend to overestimate the crime rate in their communities. Recent crimes are readily recalled — you read about them in the papers every day. Gruesome crimes create vivid memories — thus, many people think that gruesome crimes occur far more frequently than they do.

Available memories don’t have to be recent. In fact, vivid memories can last for years and affect our judgment and behavior in subtle ways. Indeed, I still go easy on tequila because of vivid memories from college days.

Satisficing, temporizing, and availability are three rules of thumb that help us get through the day. They’re part of System 1, which we can’t turn off, so we’re always vulnerable to these types of errors. In general, the benefits of System 1 outweigh the costs, but you should be aware of the costs. If the costs are getting out of hand, it’s time to switch on System 2.

I drew primarily on two sources for composing this article. First, Peter Facione’s Think Critically. (Click here) Second, Daniel Kahneman’s Thinking, Fast and Slow. (Click here).

Thinking: System 1 and System 2

Think Fast.

Do you know how you think? It’s both simpler and more complicated than you might imagine.

It turns out that we have not one but two thinking systems. One is fast, automatic, and doesn’t require much energy. The other is slow, requires a lot of energy, and activates only when needed. Both systems are naturally good at grammar. Neither system is good at statistics.

Why do we need two systems? Because much of our life is filled with routine. Rather than using our slow, energy-hungry system to deal with routine matters, we’ve developed a fast, energy-efficient system to handle daily activities. That’s why we can drive 50 miles and not remember any of it. That’s why we can enjoy a walk in the park while our mind is in some other universe — being creative no doubt.

Notice, however, that what’s routine to one person is exotic and complicated to another. What’s routine to an airline pilot would be complicated, confusing, and downright scary to me. We train our fast, energy-efficient system with our experiences. As we gain experience, more things become routine. We can do more things on auto-pilot, while turning over the creative thinking to our energy-hungry system. That may be why Earl Weaver, former manager of the Baltimore Orioles, titled his autobiography, It’s What You Learn After You Know It All That Counts.

Psychologists have named our two thinking systems — rather prosaically — System 1 and System 2. System 1 is fast and always on. You can’t turn it off. System 2 engages at various times — especially when System 1 encounters something out of the ordinary.

System 1 knows the “default” value — if everything is routine, just select the default action. To select something other than the default value, you typically need to fire up System 2. In Thinking, Fast and Slow, Daniel Kahneman tells a story about parole boards in Israel. For parole judges, the default value is to deny parole. Judges have to find something positive and think through their position to approve parole. As a result, you’re much more likely to be approved for parole if your case is heard right after lunch. Immediately after eating, the judges have a lot of fuel for their brains and find it much easier to activate System 2. Thus, it’s easier to override the default position.

While System 1 is critical to our daily living, it’s also prone to specific types of errors. Indeed, psychologists have cataloged 17 different classes of System 1 errors.  As we probe more deeply into critical thinking, I’ll provide an overview of all 17 and will delve more deeply into a few of the more common issues. Each time I review the list, I can recall a whole host of errors that I’ve made. Frankly, I’ll probably continue to make similar errors in the future. By understanding the types of errors I might make, however, I can check myself and maybe activate System 2 more frequently. As you read through the 17 types of System 1 errors, think about your own experiences. If you have good examples, please share them.

I drew primarily on two sources for composing this article. First, Daniel Kahneman’s Thinking, Fast and Slow. (Click here). Second, Peter Facione’s Think Critically. (Click here)

 

As we become more energy efficient, do we use less energy?

We buy my wife, Suellen, a new car every ten years whether she needs one or not. It’s always a red Volkswagen convertible with a manual transmission. (She’s old school). We bought the first one in 1985, another one in 1995, and the current one in 2005. She just looks cute in a red convertible.

What struck me about these three cars was how power was deployed. The 1985 model had roll-up windows and a manually operated roof. The ’95 had power windows and a manual roof. The ’05 has both power windows and a power roof. The 2005 model is clearly more energy efficient than the ’85 model but does it use less energy?

I didn’t know it at the time, but I had stumbled onto something called Jevons paradox (also known as the rebound effect). Named after William Stanley Jevons, a British economist of the mid-19th century, the paradox holds that increases in energy efficiency tend to increase, rather than decrease, total energy use. It’s basically a cost curve: greater energy efficiency means each unit of useful energy costs less, and as costs decline, we buy more energy — just as we do with most commodities.

The result: instead of raising and lowering the convertible’s roof with arm power, we put in a little motor to do it for us. According to Jevons, the net effect is that we use more energy rather than less.
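
A quick back-of-the-envelope calculation shows how the arithmetic can work out. The numbers below are invented purely for illustration; the point is only that when efficiency doubles and the now-cheaper “energy service” invites enough extra use, total consumption can rise instead of fall.

```python
# Toy illustration of Jevons paradox (the rebound effect).
# All numbers are invented for illustration, not measurements.

# Before: the inefficient setup (roll-up windows, manual roof).
efficiency_before = 1.0     # units of useful "service" per unit of energy
service_before = 100        # how much of that service we consume
energy_before = service_before / efficiency_before    # 100 units of energy

# After: twice as efficient, but the cheaper service tempts us to use much more of it
# (power windows, power roof, more driving...).
efficiency_after = 2.0
service_after = 250         # demand more than doubles because each unit is cheaper
energy_after = service_after / efficiency_after       # 125 units of energy

print(f"Energy used before: {energy_before:.0f}")
print(f"Energy used after:  {energy_after:.0f}")
# Efficiency doubled, yet total energy use rose from 100 to 125:
# the rebound in demand more than offset the efficiency gain.
```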

I’ll admit that I’m a tree hugger and that I’m concerned about global warming. I also study the processes of innovation, and I generally applaud innovations that result in greater energy efficiency. But the more I ponder Jevons paradox, the more I wonder whether we’ve aimed our innovations at the wrong target. Shouldn’t we be creating innovations that help us use less energy rather than more?

(If you want to read more about Jevons paradox, The New Yorker has a terrific article here. On the other hand, Think Progress says the paradox exists in theory but not in practice. That article is here. Interesting reading).

 

The Art of the Wrong View

In one of my classes at the University of Denver, I try to teach my students how to manage technologies that constantly morph and change. They’re unpredictable, they’re slippery, and managing them effectively can make the difference between success and failure.

The students, of course, want to predict the future so they can prepare for it. I try to convince them that predicting the future is impossible. But they’re young: they can explain the past, so why shouldn’t they be able to predict the future?

To help them prepare for the future — though not predict it — I often teach the techniques of scenario planning. You tell structured stories about the future and then work through them logically to understand which way the world might tilt. The technique has common building blocks, often referred to as PESTLE: your stories need to incorporate political, economic, societal, technological, legal, and environmental factors. This helps ensure that you don’t overlook anything.
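
If it helps to see the checklist idea in code, here is a minimal sketch of a PESTLE completeness check. Only the six category names come from the framework above; the scenario notes and the structure are my own invention, not part of Schwartz’s method.

```python
# A minimal PESTLE completeness check for a scenario draft.
# The scenario notes are invented; only the six categories come from the framework.

PESTLE = ["political", "economic", "societal", "technological", "legal", "environmental"]

scenario_notes = {
    "political": "Trade policy tightens between major economies.",
    "economic": "Energy prices fall sharply.",
    "technological": "A medical breakthrough cuts treatment costs.",
    # societal, legal, and environmental angles not yet considered
}

missing = [category for category in PESTLE if category not in scenario_notes]
if missing:
    print("This scenario still needs:", ", ".join(missing))
else:
    print("All six PESTLE angles are covered.")
```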

I’ve used scenario planning a number of times and it has always helped me think through situations in creative ways – so it seems reasonable to teach it. To prepare for a recent class, I re-read The Art of the Long View by Peter Schwartz. I found it on one of my dustier bookshelves and discovered it was the 1991 edition. While I remembered many of the main points, I was surprised to find a long chapter titled “The World in 2005: Three Scenarios”. Here was a chance to see how well one of the pioneers of scenario planning could prepare us for the future.

In sum, I was quite disappointed. The main error was that each scenario vastly overestimated the importance of Japan on the world stage in 2005. In a way, it all makes sense. The author was writing in 1991, when we all believed that Japan might just surpass every other economy on earth. Of course, he would assume that Japan would still dominate in 2005. Of course, he was wrong.

So what can we learn from this?  Two things, I think:

  1. Always remember to ask the reverse question. If it’s “obvious” that a trend will continue (e.g., Japan will dominate), remember to ask the non-obvious question: what if it doesn’t? Today, it seems obvious that health care costs will continue to rise for the foreseeable future. But what if we make a medical breakthrough and costs plummet?
  2. Remember that resilience is better than prediction. We’ll never be able to predict the future — partially because we can’t really explain the past. But we can be prepared by building flexible systems that can respond to unexpected jolts. The human immune system is probably a good model. This means building systems and organizations where information flows easily, creativity is valued, and leaders can emerge from anywhere.

I’ll continue to teach scenario planning in the future. After all, it’s a good template for thinking and planning. I’ll also be able to provide a very good example of how it can all go wrong.
