Questions, Proxies, and Health

When faced with a difficult question, we often substitute a simpler question and answer that instead. Here are three examples:

  • What’s the crime rate in your neighborhood? – We probably don’t keep close tabs on the crime rate. We’re also not naturally good at statistics, so it’s difficult to develop a reasonable estimate. So we substitute a different question: How easy is it for me to remember a crime committed in my neighborhood? If it’s easy to remember, we guess that the neighborhood has a high crime rate. If it’s hard to remember, we guess a lower rate. (I’ve adapted this example from Daniel Kahneman’s book, Thinking, Fast and Slow.)
  • How’s your car running? – It’s hard to know how well a car is running in this age of sophisticated electronics. So we answer a different question: How does the car sound? If it’s not making strange noises – knocks and pings – it must be running well and performing optimally.
  • How effective is your shampoo? – I suppose we could study our hair’s health every day but most of us don’t. So we answer a simpler question: How much lather does your shampoo produce? If we get a lot of lather, the shampoo must be effective.
How many do I need to get 10 calories of energy?

In each case, we substitute a proxy for the original question. We assume that the proxy measures the same thing that the original question aimed to measure. Sometimes we’re right; sometimes we’re wrong. Most often, we don’t think about the fact that we’re using a proxy. System 1 does the thinking for us. But we can, in fact, bring the proxy to System 2 and evaluate whether it’s effective or not. If we think about it, we can use System 2 to spot errors in System 1. But we have to think about it.

As it happens, System 1 uses proxies in some situations that we might never think about. Here’s an example: How much food should you eat?

(The following is based on a study from the University of Sydney. The research article is here. Less technical summaries are here and here).

We tend to think of food in terms of quantity. System 1 also considers food as a source of energy. System 1 is trying to answer two questions: 1) How much energy does my body need? 2) How much food does that translate to?

Our bodies have learned that sweet food delivers more energy than non-sweet food and can use this to translate from energy needs to food requirements. Let’s say that the equation looks something like this:

1 calorie* of energy is generated by 10 grams of sweet food

Let’s also assume that our body has determined that we need 10 calories of energy. A simple calculation indicates that we need to eat 100 grams of sweet food. Once we’ve eaten 100 grams, System 1 can issue a directive to stop eating.

Now let’s change the scenario by introducing artificial sweeteners that add sweetness without adding many calories. The new translation table might look like this:

1 calorie of energy is generated by 30 grams of artificially sweetened food

If we still need 10 calories of energy, we will need to eat 300 grams of artificially sweetened food. System 1 issues a directive to stop only after we’ve eaten the requisite amount.

System 1 can’t tell the difference between artificially and naturally sweetened foods. It has only one translation table. If we eat a lot of artificially sweetened food, System 1 will learn the new translation table. If we then switch back to naturally sweetened foods, System 1 will still use the new translation table. It will still tell us to eat 300 grams of food to get 10 calories of energy.
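If it helps to see the arithmetic spelled out, here is a minimal sketch of the translation-table idea in Python. The grams-per-calorie figures are the illustrative numbers from the example above, not physiological data, and the function name is hypothetical.

    # A sketch of System 1's learned "translation table" from the example above.
    # The numbers are illustrative, not physiological data.

    def grams_to_eat(energy_needed_cal, grams_per_calorie):
        """How much food System 1 directs us to eat, given its learned table."""
        return energy_needed_cal * grams_per_calorie

    # Original table: 10 grams of sweet food per calorie.
    print(grams_to_eat(10, 10))    # 100 grams satisfies a 10-calorie need

    # After a diet heavy in artificial sweeteners, the learned table becomes
    # 30 grams per calorie -- and System 1 keeps using it for all sweet food.
    directive = grams_to_eat(10, 30)
    print(directive)               # 300 grams

    # If those 300 grams turn out to be naturally sweet food (10 grams/calorie),
    # the energy delivered overshoots the 10-calorie target threefold.
    print(directive / 10)          # 30.0 calories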

We would never know that our brain makes energy/quantity assumptions if not for studies like this one. It’s not intuitively obvious that we need to invoke System 2 to examine the relationship between artificial sweeteners and food intake. But like crime rates or cars or shampoos, we often answer different questions than we think we’re answering. To think more clearly, we need to examine our proxies more carefully.

*It’s actually a kilocalorie of energy, but we Americans refer to it as a calorie.

Thinking Under Pressure

Thinking is hard. It’s even harder when you’re under pressure. Stress lowers your IQ. When your boss is yelling at you, and your ears are pinned back, it’s hard to remember to think rationally. It’s hard to think at all – mainly you just react.

So, I always encourage my students to keep several go-to questions in their heads. These are simple, memorable questions that are always available. You can go to them quickly in an emergency. Why would you go to them? Perhaps you want to clarify the situation. Maybe you need more information. Or maybe, just maybe, you need to buy a little time.

In class the other night, I asked my students to write down their best go-to questions. They had been thinking about critical thinking for seven weeks, so I assumed that they had some pretty good questions on the tips of their tongues. I was right.

I looked over the questions and realized that they fell naturally into five categories. Here are the categories, along with the questions asked most frequently in each. You might want to carry some of them around with you.

1) Gaining Self-Control – first things first: you can’t manage a situation if you can’t manage yourself. My students focused first on assessing their own situation, with questions like these:

Am I breathing effectively?

What’s my posture like? How can I change my posture to present myself more effectively?

What am I feeling right now? Are my feelings rational?

How can I engage my thinking function?

What is the other person’s purpose? Why is he behaving this way?

2) Clarifying the facts – once you’ve calmed yourself and cleared your head, you’ll want to establish what’s actually happening, with questions like these:

What are the facts? How do we know they’re facts? How were they verified?

Where did the information come from? Was the source credible?

What are our assumptions? Are they reasonable?

How did we get from the facts to the conclusions? Were there any logical fallacies along the way?

Why is this important? How does it compare in importance to Topic X or Topic Y?

Who wants to know? What is her purpose?

3) Clarifying the other person’s position – the information you have may be accurate but you also need to make sure you understand the other person’s position regarding the information. Here are some useful questions:

What’s your take on this? How do you see this?

Why do you say that? What makes you believe that?

Can you explain it in a different way?

What does “xyz” mean to you? How do you define it?

Can you paint me a picture of what you’re seeing?

Why are you so upset?

4) Clarifying the decision – you now know the “facts” and the other person’s interpretation of the facts, but you still need to figure out what decision you’re trying to make.

What outcome do we want? What’s our goal? Why?

What outcomes are possible? Which one(s) seem most fair?

Is there more than one solution? Are we trapping ourselves in a whether-or-not decision?

What if the outcome we want is not possible? Then what will we do? Is there another outcome that we might aim for?

What’s the timeframe? Are we thinking short-term or long-term?

Who else do we need to include?

How will we know when/if we need to re-visit the decision?

5) Fixing the process – when a problem arises, most organizations aim to fix the problem. They often forget to investigate the process that created the problem. Don’t forget these questions. They may well be the most important. But don’t aim for blame; this is a good time for appreciative inquiry.

How did we get here?

How can we improve our decision-making process to avoid this in the future?

What were the root causes?

How could we make a better decision in the future?

More Thumb Thinking

Us versus them.

Remember heuristics? They’re the rules of thumb that allow us to make snap judgments, using System 1, our fast, automatic, and ever-on thinking system. They can also lead us into errors. Last time I wrote about heuristics (click here), we looked at three of the 17 different error categories: satisficing, temporizing, and availability. Let’s look at four more today.

Affect — what’s your first response? What’s your initial impression? What does your gut tell you? These are all questions about your affect heuristic — more commonly known as gut feel. System 1 usually has the first word on a decision. If you let System 1 also have the last word on the decision, you’re making an affect-based decision. It may be a good decision — or maybe not. If you want to double check the accuracy of your affect, you need to fire up System 2. People with “poor impulse control” often stick with System 1 only and don’t engage System 2.

Simulation — if it’s easy to imagine a given outcome, then it’s more likely that outcome will occur, right? Not necessarily. At least in part, it depends on how good your imagination is. Salespeople can use simulation to very good effect: “Imagine how you would feel in this new suit.” “Don’t you think it would be great to drive a car like this?” “Imagine what other people will think of you when they see you on this motorcycle!” Simulation simply invokes your imagination. If it’s easy to imagine something, you may convince yourself that it’s actually going to happen. You could be right or you could be a victim of wishful thinking. Before you make a big decision, engage System 2.

Representation — “She looks like my ex-girlfriend. Therefore, she probably acts like my ex-girlfriend.” You notice that there’s a similarity between X and Y on one dimension. Therefore, you conclude that X and Y are similar on other dimensions as well. You’re letting one dimension represent other dimensions. This is essentially a poor analogy. A similarity in one dimension often has little bearing on similarities in other dimensions. Generally, the more profound a similarity is, the more likely it is to affect other dimensions. Physical appearance is not very profound. In fact, it’s apparently only skin deep.

Us versus Them — “The Republicans like this idea. Therefore, we have to hate it.” Unfortunately, we saw a lot of this in our recent elections. In fact, politics lends itself to the us versus them heuristic — because politics often boils down to a binary choice. Politics is also about belonging. I belong to this group and, therefore, I’m opposed to that group. This is often referred to as identity politics and is driven by demonstrative (as opposed to deliberative) speeches. In warfare, the us versus them heuristic may be good leadership. After all, you have to motivate your troops against a determined enemy. In politics, on the other hand, it smacks of manipulation. Time to fire up System 2. (For my article on demonstrative and deliberative speeches, click here).

Do you see yourself in any of these heuristics? Of course you do. All of us use heuristics and we use them pretty much every day. It’s how we manage “reality”. Unfortunately, they can also trick us into mistakes in logic and judgment. As you become more aware of these heuristics, you may want to engage System 2 more frequently.

To prepare this article, I drew primarily on Peter Facione’s Think Critically. (Click here)

Thinking With Your Thumbs – Part 1

Do you bite your thumb at me, sir?

Can we think with our thumbs? Well, metaphorically we do. When we use System 1 — our fast, automatic, energy-efficient thinking system — we use heuristics, shortcuts to get to an answer that is “good enough”. We often refer to heuristics as rules of thumb — rough and ready ways to deal with reality. (For a comparison of System 1 versus System 2, click here).

Our rules of thumb work most of the time but not all of the time. Psychologists have classified 17 different errors that we make when we use System 1. Let’s look at three today.

Satisficing and temporizing are two errors that often go hand in hand. Satisficing simply means that when we find a choice that’s good enough, we take it and don’t search any further. (The definition of “good enough” is entirely up to you.) Defense lawyers regularly accuse the police of satisficing. The accusation goes something like this: “You found my client and decided that he committed the crime. You stopped looking for any other suspects. You let the real criminal get away.”

Temporizing is similar to satisficing but adds a time dimension. You’re temporizing when you choose an option that’s good enough for now. How much education do you need? Well, let’s say that you can get a good job immediately with only a Bachelor’s degree. It’s good enough for now. But, 20 years from now you may not be able to get the promotion you want because you don’t have a Master’s degree. You may regret that you temporized in your younger years.

If you ever hear someone say, “if it ain’t broke, don’t fix it,” you may well conclude that they’re either satisficing or temporizing. Whatever “it” is, it’s good enough for now.

Availability is another error category that we encounter often. When we’re asked a difficult question, we often search our memory banks for cases that would help us develop an answer. If we can recall cases easily, we tend to overestimate the probability that the same phenomenon will occur again. In other words, if the cases are readily available (to our memory), we tend to exaggerate their probability. This is especially true with vivid memories. This is one reason that people tend to overestimate the crime rate in their communities. Recent crimes are readily recalled — you read about them in the papers every day. Gruesome crimes create vivid memories — thus, many people think that gruesome crimes occur far more frequently than they do.

Available memories don’t have to be recent. In fact, vivid memories can last for years and affect our judgment and behavior in subtle ways. Indeed, I still go easy on tequila because of vivid memories from college days.

Satisficing, temporizing, and availability are three rules of thumb that help us get through the day. They’re part of System 1, which we can’t turn off, so we’re always vulnerable to these types of errors. In general, the benefits of System 1 outweigh the costs, but you should be aware of the costs. If the costs are getting out of hand, it’s time to switch on System 2.

I drew primarily on two sources for composing this article. First, Peter Facione’s Think Critically. (Click here) Second, Daniel Kahneman’s Thinking, Fast and Slow. (Click here)
