Critical Thinking Through the Ages

As I’m teaching a course on critical thinking, I thought it would be useful to study the history of the concept. What have leading thinkers of the past conceived to be “critical thinking,” and how have their perceptions changed over time?

One of the earliest — and most interesting — references that I’ve found is a sermon called the Kalama Sutta, preached by the Buddha some five centuries before Christ. Often known as the “charter of free inquiry”, it lays out general tenets for discerning what is true.

Many religions hold that truth is revealed through scriptures or through institutions that are authorized to interpret scriptures. By contrast, Buddhism generally asserts that we have to ascertain truth for ourselves.  So, how do we do that?

That was essentially the question that the Kalama people asked the Buddha when he passed through their village of Kesaputta. The Buddha’s sermon emphasizes the need to question statements asserted to be true. Further, the Buddha goes on to list multiple sources of error and cautions us to carefully examine assertions from those sources.  According to Wikipedia, the Buddha identified the following sources of error:

  • Oral histories
  • Tradition
  • News sources
  • Scripture or other official documents
  • Supposition
  • Dogmatism
  • Common sense
  • Opinion
  • Experts
  • Authorities or one’s own teacher

Further, “Do not accept any doctrine from reverence, but first try it as gold is tried by fire.” This requires examination, reflection, and questioning, and only that which is “conducive to the good” should be accepted as truth.

As Thanissaro Bhikkhu summarizes it, “any view or belief must be tested by the results it yields when put into practice; and — to guard against the possibility of any bias or limitations in one’s understanding of those results — they must further be checked against the experience of people who are wise.”

So how do the Buddhist commentaries compare to other philosophers? In the century after Buddha, Socrates is quoted as saying, “I know you won’t believe me, but the highest form of Human Excellence is to question oneself and others.” Almost 2,000 years later, Francis Bacon wrote, “Critical thinking is a desire to seek, patience to doubt, fondness to meditate, slowness to assert, readiness to consider, carefulness to dispose and set in order; and hatred for every kind of imposture.” A few hundred years later, Descartes wrote, “If you would be a real seeker after truth, it is necessary that at least once in your life you doubt, as far as possible, all things.” A hundred years after that, Voltaire wrote about the consequences of a failure of critical thinking, “Anyone who has the power to make you believe absurdities has the power to make you commit injustices.”

As the Swedes would say, there seems to be a “bright red thread” that ties all of these together. Go slowly. Ask questions. Be patient. Doubt your sources. Consider your own experience. Judge the evidence thoughtfully. For well over 2,000 years our philosophers — both Eastern and Western — have been saying essentially the same thing. It seems that we know what to do. Now all we have to do is to do it.

 

 

The Last of Thumb Thinking

Sure, it’s dangerous. But we’re in control. No problem!

Heuristics are simply rules of thumb. They help us make decisions quickly and, in most cases, accurately. They help guide us through the day. Most often, we’re not even aware that we’re making decisions. Unfortunately, some of those decisions can go haywire — precisely because we’re operating on automatic pilot. In fact, psychologists suggest that we commonly make 17 errors via heuristics. I’ve surveyed 11 of them in previous posts (click here, here, and here). Let’s look at the last six today.

Optimistic bias — can get us into a lot of trouble. It leads us to conclude that we have much more control over dangerous situations than we actually do. We underestimate our risks. This can help us when we’re starting a new company; if we knew the real odds, we might never try. It can hurt us, however, when we’re trying to estimate the danger of cliff diving.

Hindsight bias — can also get us into trouble. Everything we did that was successful was by dint of our own hard work, talent, and skill. Everything we did that was unsuccessful was because of bad luck or someone else’s failures. Overconfidence, anyone?

Elimination by aspect — you’re considering multiple options and drop one of them because of a single issue. Often called the “one-strike-and-you’re-out” heuristic. I think of it as the opposite of satisficing. With satisficing, you jump on the first solution that comes along. With elimination by aspect, you drop an option at the first sign of a problem. Either way, you’re making decisions too quickly.

Anchoring with adjustment — I call this the “first-impressions-never-die” heuristic and I worry about it when I’m grading papers. Let’s say I give Harry a C on his first paper. That becomes an anchor point. When his second paper arrives, I run the risk of simply adjusting upward or downward from the anchor point. If the second paper is outstanding, I might just conclude that it’s a fluke. But it’s equally logical to assume that the first paper was a fluke while the second paper is more typical of Harry’s work. Time to weigh anchor.

Stereotyping — we all know this one: to judge an entire group based on a single instance. I got in an accident and a student from the University of Denver went out of her way to help me out. Therefore, all University of Denver students must be altruistic and helpful. Or the flip side: I had a bad experience with Delta Airlines, therefore, I’ll never fly Delta again. It seems to me that we make more negative stereotyping mistakes than positive ones.

All or Nothing — the risk of something happening is fairly low. In fact, it’s so low that we don’t have to account for it in our planning. It’s probably not going to happen. We don’t need to prepare for that eventuality. When I cross the street, there’s a very low probability that I’ll get hit by a car. So why bother to look both ways?

As with my other posts on heuristics and systematic biases, I relied heavily on Peter Facione’s book, Think Critically, to prepare this post. You can find it here.

 

 

More Thinking on Your Thumbs

Power differential.

Remember heuristics? They’re the rules of thumb that allow us to make snap judgments, using System 1, our fast, automatic, and ever-on thinking system. They can also lead us into errors. According to psychologists, there are at least 17 errors that we commonly make. In previous articles, I’ve written about seven of them (click here and here). Let’s look at four more today.

Association — word association games are a lot of fun. (Actually, words are a lot of fun). But making associations and then drawing conclusions from them can get you into trouble. You say tomato and I think of the time I ate a tomato salad and got sick. I’m not going to do that again. That’s not good hygiene or good logic. The upside is that word associations can lead you to some creative thinking. You can make connections that you might otherwise have missed. And, as we all know, connections are the foundation of innovation. Just be careful about drawing conclusions.

Power differential — did you ever work for a boss with a powerful personality? Then you know something about this heuristic. Socially and politically, it may be easier to accept an argument made by a “superior authority” than it is to oppose it. It’s natural. We tend to defer to those who have more power or prestige than we do. Indeed, there’s an upside here as well. It’s called group harmony. Sometimes you do need to accept your spouse’s preferences even if they differ from yours. The trick is to recognize when preferences are merely a matter of taste versus preferences that can have significant negative results. As Thomas Jefferson said, “On matters of style, swim with the current. On matters of principle, stand like a rock”.

Illusion of control — how much control do you really have over processes and people at your office? It’s probably a lot less than you think. I’ve worked with executives who think they’ve solved a problem just because they’ve given one good speech. A good speech can help but it’s usually just one step in a long chain of activities. Here’s a tip for spotting other people who have an illusion of control. They say “I” much more often than “we.” It’s poor communication and one of the worst mistakes you can make in a job interview. (Click here for more).

Loss and risk aversion — let’s just keep doing what we’re doing. Let’s not change things … we might be worse off. Why take risks? It happens that risk aversion has a much bigger influence on economic decisions than we once thought. In Thinking, Fast and Slow, Daniel Kahneman writes about our unbalanced logic when considering gain versus loss — we fear loss more than we’re attracted by gain. In general terms, the pain of a loss is about double the pleasure of a gain. So, emotionally, it takes a $200 gain to balance a $100 loss. Making 2-to-1 decisions may be good for your nerves but it often means that you’ll pass up good economic opportunities.
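To put that 2-to-1 arithmetic in concrete terms, here’s a minimal sketch in Python. It assumes a loss-aversion coefficient of 2 (Kahneman’s rough figure); the names and structure are mine, for illustration only.

```python
# Minimal illustration of loss aversion, assuming losses are felt
# about twice as strongly as gains (an assumed coefficient of 2.0).
LOSS_AVERSION = 2.0

def felt_value(amount):
    """Subjective (felt) value of a monetary change, in 'emotional dollars'."""
    if amount >= 0:
        return amount                  # gains count at face value
    return LOSS_AVERSION * amount      # losses hurt roughly twice as much

# A fair coin flip: win $200 or lose $100.
expected_dollars = 0.5 * 200 + 0.5 * (-100)                        # +50.0 in money terms
expected_feeling = 0.5 * felt_value(200) + 0.5 * felt_value(-100)  # 0.0 emotionally

print(expected_dollars)  # 50.0 -> economically attractive on paper
print(expected_feeling)  # 0.0  -> it only just "feels" break-even
```

So a gamble that is clearly worthwhile on paper only feels break-even, which is exactly why we pass up good economic opportunities.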

To prepare this article, I drew primarily on Peter Facione’s Think Critically. (Click here) Daniel Kahneman’s book is here.

More Thumb Thinking

Us versus them.

Remember heuristics? They’re the rules of thumb that allow us to make snap judgments, using System 1, our fast, automatic, and ever-on thinking system. They can also lead us into errors. Last time I wrote about heuristics (click here), we looked at three of the 17 different error categories: satisficing, temporizing, and availability. Let’s look at four more today.

Affect — what’s your first response? What’s your initial impression? What does your gut tell you? These are all questions about your affect heuristic — more commonly known as gut feel. System 1 usually has the first word on a decision. If you let System 1 also have the last word on the decision, you’re making an affect-based decision. It may be a good decision — or maybe not. If you want to double check the accuracy of your affect, you need to fire up System 2. People with “poor impulse control” often stick with System 1 only and don’t engage System 2.

Simulation — if it’s easy to imagine a given outcome, then it’s more likely that outcome will occur, right? Not necessarily. At least in part, it depends on how good your imagination is. Salespeople can use simulation to very good effect: “Imagine how you would feel in this new suit.” “Don’t you think it would be great to drive a car like this?” “Imagine what other people will think of you when they see you on this motorcycle!” Simulation simply invokes your imagination. If it’s easy to imagine something, you may convince yourself that it’s actually going to happen. You could be right or you could be a victim of wishful thinking. Before you make a big decision, engage System 2.

Representation — “She looks like my ex-girlfriend. Therefore, she probably acts like my ex-girlfriend.” You notice that there’s a similarity between X and Y on one dimension. Therefore, you conclude that X and Y are similar on other dimensions as well. You’re letting one dimension represent other dimensions. This is essentially a poor analogy. The similarity in one dimension has nothing to do with similarities in other dimensions. Generally, the more profound a similarity is, the more likely it is to affect other dimensions. Physical appearance is not very profound. In fact, it’s apparently only skin deep.

Us versus Them — “The Republicans like this idea. Therefore, we have to hate it.” Unfortunately, we saw a lot of this in our recent elections. In fact, politics lends itself to the us versus them heuristic — because politics often boils down to a binary choice. Politics is also about belonging. I belong to this group and, therefore, I’m opposed to that group. This is often referred to as identity politics and is driven by demonstrative (as opposed to deliberative) speeches. In warfare, the us versus them heuristic may be good leadership. After all, you have to motivate your troops against a determined enemy. In politics, on the other hand, it smacks of manipulation. Time to fire up System 2. (For my article on demonstrative and deliberative speeches, click here).

Do you see yourself in any of these heuristics? Of course you do. All of us use heuristics and we use them pretty much every day. It’s how we manage “reality”. Unfortunately, they can also trick us into mistakes in logic and judgment. As you become more aware of these heuristics, you may want to engage System 2 more frequently.

To prepare this article, I drew primarily on Peter Facione’s Think Critically. (Click here)

Thinking: System 1 and System 2

Think Fast.

Do you know how you think? It’s both simpler and more complicated than you might imagine.

It turns out that we have not one but two thinking systems. One is fast, automatic, and doesn’t require much energy. The other is slow, requires a lot of energy, and activates only when needed. Both systems are naturally good at grammar. Neither system is good at statistics.

Why do we need two systems? Because much of our life is filled with routine. Rather than using our slow, energy-hungry system to deal with routine matters, we’ve developed a fast, energy-efficient system to handle daily activities. That’s why we can drive 50 miles and not remember any of it. That’s why we can enjoy a walk in the park while our mind is in some other universe — being creative no doubt.

Notice, however, that what’s routine to one person is exotic and complicated to another. What’s routine to an airline pilot would be complicated, confusing, and downright scary to me. We train our fast, energy-efficient system with our experiences. As we gain experience, more things become routine. We can do more things on auto-pilot, while turning over the creative thinking to our energy-hungry system. That may be why Earl Weaver, former manager of the Baltimore Orioles, titled his autobiography, It’s What You Learn After You Know It All That Counts.

Psychologists have named our two thinking systems — rather prosaically — System 1 and System 2. System 1 is fast and always on. You can’t turn it off. System 2 engages at various times — especially when System 1 encounters something out of the ordinary.

System 1 knows the “default” value — if everything is routine, just select the default action. To select something other than the default value, you typically need to fire up System 2. In Thinking, Fast and Slow, Daniel Kahneman tells a story about parole boards in Israel. For parole judges, the default value is to deny parole. Judges have to find something positive and think through their position to approve parole. As a result, you’re much more likely to be approved for parole if your case is heard right after lunch. Immediately after eating, the judges have a lot of fuel for their brains and find it much easier to activate System 2. Thus, it’s easier to override the default position.

While System 1 is critical to our daily living, it’s also prone to specific types of errors. Indeed, psychologists have cataloged 17 different classes of System 1 errors.  As we probe more deeply into critical thinking, I’ll provide an overview of all 17 and will delve more deeply into a few of the more common issues. Each time I review the list, I can recall a whole host of errors that I’ve made. Frankly, I’ll probably continue to make similar errors in the future. By understanding the types of errors I might make, however, I can check myself and maybe activate System 2 more frequently. As you read through the 17 types of System 1 errors, think about your own experiences. If you have good examples, please share them.

I drew primarily on two sources for composing this article. The first is Daniel Kahneman’s Thinking, Fast and Slow (click here). The second is Peter Facione’s Think Critically (click here).

 
