It’s hard to think critically when you don’t know what you’re missing. As we think about improving our thinking, we need to account for two things so subtle that we don’t fully recognize them: our assumptions and our filters.
Because of assumptions and filters, we often talk past each other. The world is a confusing place, and it becomes even more confusing when each person’s perception of what’s “out there” is unique. How can we overcome these effects? We need to consider two sets of questions: questions about ourselves and the assumptions and filters we bring, and questions about the world around us.
The more we study assumptions and filters, the more attuned we become to their prevalence. When we make a decision, we’ll remember to inquire about ourselves before we inquire about the world around us. That will lead us to better decisions.
In my critical thinking class, we begin by studying 17 cognitive biases that are drawn from Peter Facione’s excellent textbook, Think Critically. I like the way Facione organizes and describes the major biases. His work is very teachable. And 17 is a manageable number of biases to teach and discuss.
While the 17 biases provide a good introduction to the topic, there are more biases that we need to be aware of. For instance, there’s the survivorship bias. Then there’s the swimmer’s body fallacy. And the IKEA effect. And the self-herding bias. And don’t forget the fallacy fallacy. How many biases are there in total? Well, it depends on who’s counting and how many hairs we’d like to split. One author says there are 25. Another suggests that there are 53. Whatever the precise number, there are enough cognitive biases that leading consulting firms like McKinsey now have “debiasing” practices to help their clients make better decisions.
The ultimate list of cognitive biases probably comes from Wikipedia, which identifies 104 biases. Frankly, I think Wikipedia is splitting hairs. But I do like the way Wikipedia organizes the various biases into four major categories. The categorization helps us think about how biases arise and, therefore, how we might overcome them. The four categories are:
1) Biases that arise from too much information – examples include: We notice things already primed in memory. We notice (and remember) vivid or bizarre events. We notice (and attend to) details that confirm our beliefs.
2) Biases that arise from not enough meaning – examples include: We fill in blanks from stereotypes and prior experience. We conclude that things we’re familiar with are better in some regard than things we’re not familiar with. We calculate risk based on what we remember (and we remember vivid or bizarre events).
3) Biases that arise from how we remember – examples include: We reduce events (and memories of events) to their key elements. We edit memories after the fact. We conflate memories that happened at similar times but in different places, in the same place but at different times, or with the same people, and so on.
4) Biases that arise from the need to act fast – examples include: We favor simple options with more complete information over more complex options with less complete information. We stick with whatever we’ve already started (inertia) rather than switching to a different option.
It’s hard to keep 17 things in mind, much less 104. But we can keep four things in mind. I find that these four categories are useful because, as I make decisions, I can ask myself simple questions, like: “Hmmm, am I suffering from too much information or not enough meaning?” I can remember these categories and carry them with me. The result is often a better decision.
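Because the habit is worth building, here’s a minimal sketch of the four categories as a small lookup table in code. The example tendencies are my paraphrases of the list above, not Wikipedia’s exact wording:

```python
# The four bias categories as a small lookup table, each mapped to a
# few example tendencies (paraphrased, not an official taxonomy).
BIAS_CATEGORIES = {
    "too much information": [
        "we notice things already primed in memory",
        "we notice and remember vivid or bizarre events",
        "we attend to details that confirm our beliefs",
    ],
    "not enough meaning": [
        "we fill in blanks from stereotypes and prior experience",
        "we prefer the familiar to the unfamiliar",
        "we judge risk by what we happen to remember",
    ],
    "how we remember": [
        "we reduce events to their key elements",
        "we edit memories after the fact",
        "we conflate memories from similar times, places, or people",
    ],
    "the need to act fast": [
        "we favor simple, well-documented options over complex ones",
        "we stick with whatever we've already started (inertia)",
    ],
}

# The self-check described above: one simple question per category.
for category in BIAS_CATEGORIES:
    print(f"Am I suffering from {category}?")
```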
I just spotted an article on Inc. magazine’s website.
The article’s subhead is: “America’s 25 most admired CEOs have earned the respect of their people. Here’s how you can too.”
Does this sound familiar? It’s a good example of the survivorship fallacy. The 25 CEOs selected for the article “survived” a selection process. The author then highlights the common behaviors among the 25 leaders. The implication is that — if you behave the same way — you too will become a revered leader.
Is it true? Well, think about the hundreds of CEOs who didn’t survive the selection process. I suspect that many of the unselected CEOs behave in ways similar to the 25 selectees. But the unselected CEOs didn’t become revered leaders. Why not? Hard to say … precisely because we’re not studying them. It’s not at all clear to me that I will become a revered leader if I behave like the 25 selectees. In fact, the reverse may be true — people may think that I’m being inauthentic and lose respect for me.
A better research method would be to select 25 leaders who are “revered” and compare them to 25 leaders who are not “revered”. (Defining what “revered” means will be slippery). By selecting two groups, we have some basis for comparison and contrast. This can often lead to deeper insights.
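To see why the survivors-only method is so weak, here’s a minimal simulation sketch. All the numbers are hypothetical: a made-up “listens to their people” behavior that is common and barely affects success, which is mostly luck. This is not data about real CEOs:

```python
# A minimal simulation of survivorship bias (hypothetical numbers).
import random

random.seed(42)

ceos = []
for _ in range(1000):
    listens = random.random() < 0.6          # behavior is common (~60%)
    luck = random.gauss(0, 1)                # success is mostly luck
    success = luck + (0.1 if listens else 0) # behavior helps only a little
    ceos.append((listens, success))

# Rank by success and keep the top 25 "survivors", as the article does.
ranked = sorted(ceos, key=lambda c: c[1], reverse=True)
survivors, everyone_else = ranked[:25], ranked[25:]

def share_who_listen(group):
    return sum(1 for listens, _ in group if listens) / len(group)

print(f"Top-25 survivors who listen: {share_who_listen(survivors):.0%}")
print(f"Everyone else who listens:   {share_who_listen(everyone_else):.0%}")
```

Because the behavior is common and barely affects success, the top 25 share it at roughly the same rate as everyone else. That is exactly why a survivors-only study tells us so little, and why the comparison-group design above is stronger.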
As it stands, the Inc. article reminds me of the book for teenagers called How To Be Popular. It’s cute but not very meaningful.
When I went off to college, my mother told me, “Now remember … you’re going to college to learn how to think. Don’t miss that lesson.”
I wonder what she would say if she were sending me off to college today. It might be more along the lines of, “Now remember … you’re going to college to get a good job. Don’t blow it.”
We can only judge programs and processes based on their goals. If the goal of government is to provide good services at a reasonable cost, we might give it a fairly low grade. However, if the goal of government is to increase employment, then we might evaluate it more positively. The same is true of higher education. So what is the goal of higher education? Is it to teach students how to think? Or is it to provide them skills to get a job?
I would argue that the goal of higher education – indeed of any education – is to improve the students’ ability to think. Good thinking can certainly help you get a job. In fact, it may be the ultimate job skill. But good thinking can take you much farther than a good job. Here’s my thinking on the issue:
1) Thinking is foundational — the essence of running a business (or a government) is to make decisions about the future. To make effective decisions, we need to understand how we think, how our thinking can be biased, and how to evaluate evidence and arguments. If we know everything about finance, for instance, but don’t know how to think effectively, we will make decisions based on faulty evidence, weak arguments, and unconscious biases. By chance, we might still make some good decisions. But we should remember Louis Pasteur’s thought, “Chance favors the prepared mind.”
2) The future is unknowable — our niece, Amelia, will graduate from college next May. If she works until she’s 65, she’ll retire in the year 2060. What skills will employers need in 2060? Who knows? As I reflect on my own education, the content I learned in college is largely useless today. The processes I learned, however, are still very relevant. Thinking is the ultimate process. Amelia will still need to think effectively in 2060.
3) Thinking promotes freedom – if we can’t think for ourselves, we will forever be buffeted by other people’s agendas, desires, ambitions, and rhetorical excesses. Critical thinking allows us to assess ideas and social movements and make effective decisions on our own. We can frame our thinking as we wish and not allow others to create frames for us. We can identify the truth rather than relying on others to tell us what is true. Critical thinking allows us to take control of our own destiny, which is the essence of freedom. I don’t know of any other discipline that can make the same claim.
When faced with a difficult question, we often substitute a simpler question and answer that instead. Here are three examples:
1) Asked “Is crime getting worse?”, we answer an easier question: “Can I remember hearing about a crime recently?”
2) Asked “Is this car a sound long-term purchase?”, we answer: “Do I like the way it looks and drives?”
3) Asked “Is this shampoo good for my hair?”, we answer: “Does it smell nice and lather well?”
In each case, we substitute a proxy for the original question. We assume that the proxy measures the same thing that the original question aimed to measure. Sometimes we’re right; sometimes we’re wrong. Most often, we don’t think about the fact that we’re using a proxy. System 1 does the thinking for us. But we can, in fact, bring the proxy to System 2 and evaluate whether it’s effective or not. If we think about it, we can use System 2 to spot errors in System 1. But we have to think about it.
As it happens, System 1 uses proxies in some situations that we might never think about. Here’s an example: How much food should you eat?
We tend to think of food in terms of quantity. System 1 also considers food as a source of energy. System 1 is trying to answer two questions: 1) How much energy does my body need? 2) How much food does that translate to?
Our bodies have learned that sweet food delivers more energy than non-sweet food and can use this to translate from energy needs to food requirements. Let’s say that the equation looks something like this:
1 calorie* of energy is generated by 10 grams of sweet food
Let’s also assume that our body has determined that we need 10 calories of energy. A simple calculation indicates that we need to eat 100 grams of sweet food. Once we’ve eaten 100 grams, System 1 can issue a directive to stop eating.
Now let’s change the scenario by introducing artificial sweeteners that add sweetness without adding many calories. The new translation table might look like this:
1 calorie of energy is generated by 30 grams of artificially sweetened food
If we still need 10 calories of energy, we will need to eat 300 grams of artificially sweetened food. System 1 issues a directive to stop only after we’ve eaten the requisite amount.
System 1 can’t tell the difference between artificially and naturally sweetened foods. It has only one translation table. If we eat a lot of artificially sweetened food, System 1 will learn the new translation table. If we then switch back to naturally sweetened foods, System 1 will still use the new translation table. It will still tell us to eat 300 grams of food to get 10 calories of energy.
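To make the arithmetic concrete, here’s a minimal sketch of that single translation table in code, using the illustrative numbers from above (not real physiology):

```python
# A sketch of System 1's one-and-only "translation table", using the
# illustrative numbers from the text (not real physiology).
ENERGY_NEEDED = 10  # calories the body has decided it needs

def grams_to_eat(grams_per_calorie):
    """How much food System 1 tells us to eat before signaling 'stop'."""
    return ENERGY_NEEDED * grams_per_calorie

# Naturally sweet food: 10 grams deliver 1 calorie.
print(grams_to_eat(10))   # 100 grams, then the "stop eating" directive

# Artificially sweetened food: 30 grams deliver 1 calorie.
print(grams_to_eat(30))   # 300 grams before the "stop" signal

# The catch: there is only ONE table. After a diet heavy in artificial
# sweeteners, System 1 relearns the rate as 30 grams per calorie and
# applies it even to naturally sweet food.
learned_rate = 30
print(grams_to_eat(learned_rate))  # still 300 grams, though 100 would do
```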
We would never know that our brain makes energy/quantity assumptions if not for studies of artificial sweeteners and food intake. It’s not intuitively obvious that we need to invoke System 2 to examine the relationship between artificial sweeteners and how much we eat. But whether the question is about crime rates or cars or shampoos, we often answer different questions than we think we’re answering. To think more clearly, we need to examine our proxies more carefully.
*It’s actually a kilocalorie of energy but we Americans refer to it as a calorie.