As I’m teaching a course on critical thinking, I thought it would be useful to study the history of the concept. What have leading thinkers of the past conceived to be “critical thinking” and how have their perceptions changed over time?
One of the earliest — and most interesting — references that I’ve found is a sermon called the Kalama Sutta, preached by the Buddha some five centuries before Christ. Often known as the “charter of free inquiry”, it lays out general tenets for discerning what is true.
Many religions hold that truth is revealed through scriptures or through institutions that are authorized to interpret scriptures. By contrast, Buddhism generally asserts that we have to ascertain truth for ourselves. So, how do we do that?
That was essentially the question that the Kalama people asked the Buddha when he passed through their village of Kesaputta. The Buddha’s sermon emphasizes the need to question statements asserted to be true. Further, the Buddha goes on to list multiple sources of error and cautions us to carefully examine assertions from those sources. According to Wikipedia, the Buddha identified the following sources of error:

- Repeated hearing (oral tradition)
- Lineage and tradition
- Rumor and hearsay
- Scripture
- Surmise and conjecture
- Axioms
- Specious reasoning
- Bias toward a notion that one has pondered over
- The speaker’s seeming ability
- The consideration, “The monk is our teacher”
Further, “Do not accept any doctrine from reverence, but first try it as gold is tried by fire.” This requires examination, reflection, and questioning, and only that which is “conducive to the good” should be accepted as truth.
As Thanissaro Bhikkhu summarizes it, “any view or belief must be tested by the results it yields when put into practice; and — to guard against the possibility of any bias or limitations in one’s understanding of those results — they must further be checked against the experience of people who are wise.”
So how do the Buddhist commentaries compare to other philosophers? In the century after Buddha, Socrates is quoted as saying, “I know you won’t believe me, but the highest form of Human Excellence is to question oneself and others.” Almost 2,000 years later, Francis Bacon wrote, “Critical thinking is a desire to seek, patience to doubt, fondness to meditate, slowness to assert, readiness to consider, carefulness to dispose and set in order; and hatred for every kind of imposture.” A few decades later, Descartes wrote, “If you would be a real seeker after truth, it is necessary that at least once in your life you doubt, as far as possible, all things.” A hundred years after that, Voltaire wrote about the consequences of a failure of critical thinking, “Anyone who has the power to make you believe absurdities has the power to make you commit injustices.”
As the Swedes would say, there seems to be a “bright red thread” that ties all of these together. Go slowly. Ask questions. Be patient. Doubt your sources. Consider your own experience. Judge the evidence thoughtfully. For well over 2,000 years our philosophers — both Eastern and Western — have been saying essentially the same thing. It seems that we know what to do. Now all we have to do is to do it.
In his book, Thinking, Fast and Slow, Daniel Kahneman offers an interesting example of a heuristic bias. Read the description, then answer the question.
Steve is very shy and withdrawn, invariably helpful but with little interest in people or in the world of reality. A meek and tidy soul, he has a need for order and structure, and a passion for detail.
Is Steve more likely to be a librarian or a farmer?
I used this example in my critical thinking class the other night. About two-thirds of the students guessed that Steve is a librarian; one-third said he’s a farmer. As we debated Steve’s profession, the class focused exclusively on the information in the simple description.
Kahneman’s example illustrates two problems with the rules of thumb (heuristics) that are often associated with our System 1 thinking. The first is simply stereotyping. The description fits our widely held stereotype of male librarians. It’s easy to conclude that Steve fits the stereotype. Therefore, he must be a librarian.
The second problem is more subtle — what evidence do we use to draw a conclusion? In the class, no one asked for additional information. (This is partially because I encouraged them to reach a decision quickly. They did what their teacher asked them to do. Not always a good idea.) Rather, they used the information that was available. This is often known as the availability bias — we make a decision based on the information that’s readily available to us. As it happens, male farmers in the United States outnumber male librarians by a ratio of about 20 to 1. If my students had asked about this, they might have concluded that Steve is probably a farmer — statistically at least.
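The base-rate logic above can be made concrete with a small calculation. This is a minimal sketch, not anything from Kahneman: the 20-to-1 ratio comes from the text, but the "fit" probabilities (how often the shy-and-tidy description matches each profession) are illustrative assumptions I've made up for the example.

```python
# Base-rate check for the Steve example, using Bayes' rule.
# Assumption (not from the source): the description fits 40% of
# librarians but only 10% of farmers. The 20:1 ratio is from the text.

def posterior_librarian(fit_given_librarian, fit_given_farmer, farmer_ratio):
    """P(librarian | description), considering only these two professions."""
    p_librarian = 1 / (1 + farmer_ratio)          # prior from the base rate
    p_farmer = farmer_ratio / (1 + farmer_ratio)
    evidence = (fit_given_librarian * p_librarian
                + fit_given_farmer * p_farmer)    # total P(description)
    return fit_given_librarian * p_librarian / evidence

p = posterior_librarian(0.40, 0.10, 20)
print(f"P(librarian | description) = {p:.2f}")  # = 0.17
```

Even when the description is four times more typical of librarians, the 20-to-1 base rate dominates: Steve is still far more likely to be a farmer.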
The availability bias can get you into big trouble in business. To illustrate, I’ll draw on an example (somewhat paraphrased) from Paul Nutt’s book, Why Decisions Fail.
Peca Products is locked in a fierce competitive battle with its archrival, Frangro Enterprises. Peca has lost 4% market share over the past three quarters. Frangro has added 4% in the same period. A board member at Peca — a seasoned and respected business veteran — grows alarmed and concludes that Peca has a quality problem. She sends memos to the executive team saying, “We have to solve our quality problem and we have to do it now!” The executive team starts chasing down the quality issues.
The Peca Products executive team is falling into the availability trap. Because someone who is known to be smart and savvy and experienced says the company has a quality problem, the executives believe that the company has a quality problem. But what if it’s a customer service problem? Or a logistics problem? Peca’s executives may well be solving exactly the wrong problem. No one stopped to ask for additional information. Rather, they relied on the available information. After all, it came from a trusted source.
So, what to do? The first thing to remember in making any significant decision is to ask questions. It’s not enough to ask questions about the information you have. You also need to seek out additional information. Questioning also allows you to challenge a superior in a politically acceptable manner. Rather than saying “you’re wrong!” (and maybe getting fired), you can ask, “Why do you think that? What leads you to believe that we have a quality problem?” Proverbs says that “a gentle answer turneth away wrath”. So does an insightful question.
When I’m happy, I smile. A few years ago, I discovered that the reverse is also true: when I smile, I get happy. I’ve also found that standing up straight — as opposed to slouching — can improve my mood and perhaps even my productivity.
It turns out that I was on to something much bigger than I realized at the time. There’s increasing evidence that we think with our bodies as well as our minds. Some of the evidence is simply the way we express our thoughts through physical metaphors. For instance, “I’m in over my head”, “I’m up to my neck in trouble”, “He’s my right hand man”, and so on. Because we use bodily metaphors to express our mental condition, the field of “body thinking” is often referred to as metaphor theory. More generally, it’s called embodied cognition.
In experiments reported in New Scientist and in Scientific American, we can see some interesting effects. For instance, with people from western cultures, “up” and “right” mean bigger or better while “down” or “left” mean smaller or worse. So, for instance, when volunteers were asked to call out random numbers to the beat of a metronome, their eyes moved upward when the next number was larger and moved downward when the next number was smaller. Similarly, volunteers were asked to gauge the number of objects in a container. When they were asked to lean to the left while formulating their estimate, they guessed smaller numbers on average. When leaning to the right, they guessed larger numbers.
In another experiment, volunteers were asked to move boxes either: 1) from a lower shelf to a higher shelf; or 2) from a higher shelf to a lower shelf. While moving the boxes, they were asked a simple, open-ended question, like: What were you doing yesterday? Those who were moving the boxes upward were more likely to report positive experiences. Those who were moving the boxes downward were more likely to report negative experiences.
When I speak of the future, I often gesture forward — the future is ahead of us. When I speak of the past, I often do the opposite, gesturing back over my shoulder. Interestingly, the Aymara Indians of highland Bolivia are reported to do the opposite — pointing backward to the future and forward to the past. That seems counter-intuitive to me, but the Aymara explain that the future is invisible. You can’t see it. It’s as if it’s behind you, where it can quite literally sneak up on you. The past, on the other hand, is entirely visible — it’s spread out in front of you. That makes a lot of sense, but my cultural training is so deeply embedded that I find it very hard to point backward to the future. It’s an interesting example of cultural differences influencing embodied cognition.
Where does this leave us? Well, it’s certainly a step away from Descartes’ formulation of the mind-body duality. The body is not simply a sensing instrument that feeds data to the mind. It also feeds thoughts and influences our moods in subtle ways. Yet another reason to take good care of your body.
Heuristics are simply rules of thumb. They help us make decisions quickly and, in most cases, accurately. They help guide us through the day. Most often, we’re not even aware that we’re making decisions. Unfortunately, some of those decisions can go haywire — precisely because we’re operating on automatic pilot. In fact, psychologists suggest that we commonly make 17 kinds of errors via heuristics. I’ve surveyed 11 of them in previous posts (click here, here, and here). Let’s look at the last six today.
Optimistic bias — can get us into a lot of trouble. It leads us to conclude that we have much more control over dangerous situations than we actually do. We underestimate our risks. This can help us when we’re starting a new company; if we knew the real odds, we might never try. It can hurt us, however, when we’re trying to estimate the danger of cliff diving.
Hindsight bias — can also get us into trouble. Everything we did that was successful is by dint of our own hard work, talent, and skill. Everything we did that was unsuccessful was because of bad luck or someone else’s failures. Overconfidence, anyone?
Elimination by aspect — you’re considering multiple options and drop one of them because of a single issue. Often called the “one-strike-and-you’re-out” heuristic. I think of it as the opposite of satisficing. With satisficing, you jump on the first solution that comes along. With elimination by aspect, you drop an option at the first sign of a problem. Either way, you’re making decisions too quickly.
Anchoring with adjustment — I call this the “first-impressions-never-die” heuristic and I worry about it when I’m grading papers. Let’s say I give Harry a C on his first paper. That becomes an anchor point. When his second paper arrives, I run the risk of simply adjusting upward or downward from the anchor point. If the second paper is outstanding, I might just conclude that it’s a fluke. But it’s equally logical to assume that the first paper was a fluke while the second paper is more typical of Harry’s work. Time to weigh anchor.
Stereotyping — we all know this one: to judge an entire group based on a single instance. I got in an accident and a student from the University of Denver went out of her way to help me out. Therefore, all University of Denver students must be altruistic and helpful. Or the flip side: I had a bad experience with Delta Airlines, therefore, I’ll never fly Delta again. It seems to me that we make more negative stereotyping mistakes than positive ones.
All or Nothing — the risk of something happening is fairly low. In fact, it’s so low that we don’t have to account for it in our planning. It’s probably not going to happen. We don’t need to prepare for that eventuality. When I cross the street, there’s a very low probability that I’ll get hit by a car. So why bother to look both ways?
As with my other posts on heuristics and systematic biases, I relied heavily on Peter Facione’s book, Think Critically, to prepare this post. You can find it here.
What’s a debacle? According to Paul Nutt, it’s “…merely a botched decision that attracts attention and gets a public airing.” Nutt goes on to write that his “…research shows that half of the decisions made in business and related organizations fail.” Actually, it may be higher because, “failed decisions that avoid a public airing are apt to be covered up.”
Remember that I wrote not long ago (click here) that perhaps 70% of change management efforts fail? Now we learn that half — or more — of all business decisions fail. We’re not doing so well. Nutt has studied over 400 debacles — botched decisions that became public disasters — and has created an anatomy of why and how they happen. Nutt’s book, Why Decisions Fail, is a sobering look at how we manage our organizations and, more specifically, our decisions.
Critical thinking should help us avoid botched decisions and public debacles. I’ll be writing about critical thinking over the next several months and, from time to time, will pull ideas from Nutt’s book. Today, let’s set the stage by looking at the basics. Nutt writes that blunders happen for three broad reasons:

- Premature commitments: decision makers jump on the first remedy that surfaces and rush to judgment.
- Misused resources: time and money go toward defending the early choice rather than exploring alternatives.
- Failure-prone practices: decision makers apply tactics that are known to fail far more often than they succeed.
Nutt also criticizes contingency theory — the idea that your situation dictates your tactics. For instance, if you’re faced with a community boycott, you should do X; if you’re faced with cost overruns, you should do Y. Nutt concludes that, “Best practices can be followed regardless of the decision to be made or the circumstances surrounding it.” The bulk of his book outlines what those best practices are.
Of course, there’s a lot more to it. I’ll outline the highlights in future posts and put Nutt’s findings in the general context of critical thinking. I hope you’ll follow along. In the meantime, don’t make any premature commitments.