People often ask me why they should take a class in critical thinking. Their typical refrain is, “I already know how to think.” I find that the best answer is a story about the mistakes we often make.
So I offer up the following example, drawn from recent news, about very smart people who missed a critical clue because they were not thinking critically.
The story is about the conventional wisdom surrounding Alzheimer’s. We’ve known for years that people who have Alzheimer’s also have higher than normal deposits of beta amyloid plaques in their brains. These plaques build up over time and interfere with memory and cognitive processes.
The conventional wisdom holds that beta amyloid plaques are an aberration. The brain has essentially gone haywire and starts to attack itself. It’s a mistake. A key research question has been: how do we prevent this mistake from happening? It’s a difficult question to answer because we have no idea what triggered the mistake.
But recent research, led by Rudolph Tanzi and Robert Moir, considers the opposite question. What if the buildup of beta amyloid plaques is not a mistake? What if it serves some useful purpose?
Pursuing this line of reasoning, Tanzi and Moir discovered that beta amyloid is actually an antimicrobial substance. It has a beneficial purpose: to attack bacteria and viruses and smother them. It’s not a mistake; it’s a defense mechanism.
Other Alzheimer’s researchers have described themselves as “gobsmacked” and “surprised” by the discovery. One said, “I never thought about it as a possibility.”
A student of critical thinking might ask, Why didn’t they think about this sooner? A key tenet of critical thinking is that one should always ask the opposite question. If conventional wisdom holds that X is true, a critical thinker would automatically ask, Is it possible that the opposite of X is true in some way?
Asking the opposite question is a simple way to identify, clarify, and check our assumptions. When the conventional wisdom is correct, it leads to a dead end. But, occasionally, asking the opposite question can lead to a Nobel Prize. Consider the case of Barry Marshall.
A doctor in Perth, Australia, Marshall was concerned about his patients’ stomach ulcers. Conventional wisdom held that bacteria couldn’t possibly live in the gastric juices of the human gut. So bacteria couldn’t possibly cause ulcers. More likely, stress and anxiety were the culprits. But Marshall asked the opposite question and discovered the bacteria now known as H. pylori. Stress doesn’t cause ulcers; bacteria do. For asking the opposite question – and answering it – Marshall won the Nobel Prize in Physiology or Medicine in 2005.
The discipline of critical thinking gives us a structure and method – almost a checklist – for how to think through complex problems. We should always ask the opposite question. We should be aware of common fallacies and cognitive biases. We should understand the basics of logic and argumentation. We should ask simple, blunt questions. We should check our egos at the door. If we do all this – and more – we tilt the odds in our favor. We prepare our minds systematically and open them to new possibilities – perhaps even the possibility of curing Alzheimer’s. That’s a good reason to study critical thinking.
It’s hard to think critically when you don’t know what you’re missing. As we think about improving our thinking, we need to account for two things that are so subtle that we don’t fully recognize them: our assumptions and our perceptual filters.
Because of assumptions and filters, we often talk past each other. The world is a confusing place and becomes even more confusing when our perception of what’s “out there” is unique. How can we overcome these effects? We need to consider two sets of questions: first, questions about ourselves and the assumptions and filters we carry; second, questions about the world around us.
The more we study assumptions and filters, the more attuned we become to their prevalence. When we make a decision, we’ll remember to inquire about ourselves before we inquire about the world around us. That will lead us to better decisions.
In my critical thinking class, we begin by studying 17 cognitive biases that are drawn from Peter Facione’s excellent textbook, Think Critically. I like the way Facione organizes and describes the major biases. His work is very teachable. And 17 is a manageable number of biases to teach and discuss.
While the 17 biases provide a good introduction to the topic, there are more biases that we need to be aware of. For instance, there’s the survivorship bias. Then there’s the swimmer’s body fallacy. And the IKEA effect. And the self-herding bias. And don’t forget the fallacy fallacy. How many biases are there in total? Well, it depends on who’s counting and how many hairs we’d like to split. One author says there are 25. Another suggests that there are 53. Whatever the precise number, there are enough cognitive biases that leading consulting firms like McKinsey now have “debiasing” practices to help their clients make better decisions.
The ultimate list of cognitive biases probably comes from Wikipedia, which identifies 104 biases. Frankly, I think Wikipedia is splitting hairs. But I do like the way Wikipedia organizes the various biases into four major categories. The categorization helps us think about how biases arise and, therefore, how we might overcome them. The four categories are:
1) Biases that arise from too much information – examples include: We notice things already primed in memory. We notice (and remember) vivid or bizarre events. We notice (and attend to) details that confirm our beliefs.
2) Not enough meaning – examples include: We fill in blanks from stereotypes and prior experience. We conclude that things that we’re familiar with are better in some regard than things we’re not familiar with. We calculate risk based on what we remember (and we remember vivid or bizarre events).
3) How we remember – examples include: We reduce events (and memories of events) to their key elements. We edit memories after the fact. We conflate memories that happened at similar times but in different places, or in the same place but at different times, or with the same people, and so on.
4) The need to act fast – examples include: We favor simple options with more complete information over more complex options with less complete information. Inertia – if we’ve started something, we continue to pursue it rather than changing to a different option.
It’s hard to keep 17 things in mind, much less 104. But we can keep four things in mind. I find that these four categories are useful because, as I make decisions, I can ask myself simple questions, like: “Hmmm, am I suffering from too much information or not enough meaning?” I can remember these categories and carry them with me. The result is often a better decision.
Red people and blue people are at it again. Neither side seems to accept that the other side consists of real people with real ideas that are worth listening to. Debate is out. Contempt is in.
As a result, our nation is highly polarized. To work our way out of the current stalemate, we need to listen closely and speak wisely. We need to debate effectively rather than arguing angrily. Here are some tips:
It’s not about winning, it’s about winning over – too often we talk about winning an argument. But defeating an opponent is not the same as winning him over to your side. Aim for agreement, not a crushing blow.
It’s not about values – our values are deeply held. We don’t change them easily. You’re not going to convert a red person into a blue person or vice-versa. Aim to change their minds, not their values.
Stick to the future tense – the only reason to argue in the past tense is to assign blame. That’s useful in a court of law but not in the court of public opinion. Stick to the future tense, where you can present choices and options. That’s where you can change minds. (Tip: don’t ever argue with a loved one in the past tense. Even if you win, you lose.)
The best way to disagree is to begin by agreeing – the other side wants to know that you take them seriously. If you immediately dismiss everything they say, you’ll never persuade them. Start by finding points of agreement. Even if you’re at opposite ends of the spectrum, you can find something to agree to.
Don’t fall for the anger mongers – both red and blue commentators prey on our pride to sell anger. They say things like, “The other side hates you. They think you’re dumb. They think they’re superior to you.” The technique is known as attributed belittlement and it’s the oldest trick in the book. Don’t fall for it.
Don’t fall into the hypocrisy trap – both red and blue analysts are willing to spin for their own advantage. Don’t assume that one side is hypocritical while the other side is innocent.
Beware of demonizing words – it’s easy to use positive words for one side and demonizing words for the other side. For example: “We’re proud. They’re arrogant.” “We’re smart. They’re sneaky.” It’s another old trick. Don’t fall for it.
Show some respect – just because people disagree with you is no reason to treat them with contempt. They have their reasons. Show some respect even if you disagree.
Be skeptical – the problems we’re facing as a nation are exceptionally complex. Anyone who claims to have a simple solution is lying.
Burst your bubble – open yourself up to sources you disagree with. Talk with people on the other side. We all live in reality bubbles. Time to break out.
Give up TV – talking heads, both red and blue, want to tell you what to think. Reading your own sources can help you learn how to think.
Aim for the persuadable – you’ll never convince some people. Don’t waste your breath. Talk with open-minded people who describe themselves as moderates. How can you tell they’re open-minded? They show respect, don’t belittle, agree before disagreeing, and are skeptical of both sides.
Engage in arguments – find people who know how to argue without anger. Argue with them. If they’re red, take a blue position. If they’re blue, take a red position. Practice the art of arguing. You’re going to need it.
Remember that the only thing worse than arguing is not arguing – We know how to argue. Now we need to learn to argue without anger. Our future may depend on it.
Most people (in America at least) would probably agree with the following statement:
Men are bigger risk takers than women.
Several research studies seem to have documented this. Researchers have asked people what risky behaviors they engage in (or would like to engage in). For instance, they might ask a randomly selected group of men and women whether they would like to jump out of an airplane (with a parachute). Men – more often than women – say that this is an appealing idea. Ask about riding a motorcycle and the response is more or less the same. Men are interested, women not so much. QED: men are bigger risk takers than women.
But are we taking a conceptual leap here (without a parachute)? How do we know if something is true? What’s the operational definition of “risk”? Should we be engaging our baloney detectors right about now?
In her new book, Testosterone Rex, Cordelia Fine suggests that we’ve pretty much got it all backwards. The problem with using skydiving and motorcycle riding as proxies for risk is that they are far too narrow. Indeed, they are narrowly masculine definitions of risk. So, in effect, we’re asking a different question:
Would you like to engage in activities that most men define as risky?
It’s a circular argument. We give a masculine definition of risk and then conclude that men are more likely to engage in that activity than women. No duh.
Fine points out that, “In the United States, being pregnant is about 20 times more likely to result in death than is a sky dive.” So which gender is really taking the big risks?
As with so many issues in logic and critical thinking, we need to examine our definitions. If we define our variables in narrow ways, we’ll get narrow and – most likely – biased results.
Fine writes that many people believe in Testosterone Rex – the idea that differences between men and women are biological and driven largely by hormonal effects. But when she examines the evidence, she finds one logical flaw after another. Researchers skew definitions, reverse cause-and-effect, and use small samples to produce large (and unsupported) conclusions.
Ultimately, Fine concludes that we aren’t born as males and females in the traditional way that we think about gender. Rather, when we’re born, society starts to shape us into its conception of what each gender ought to be. It’s a bracing and clearly argued point that seems to be backed up by substantial evidence.
It’s also a great example of baloney detection and a good case study for any class in critical thinking.