In my critical thinking class, we investigate a couple of dozen cognitive biases — fallacies in the way our brains process information and reach decisions. These include the confirmation bias, the availability bias, the survivorship bias, and many more. I call these factory-installed biases – we’re born this way.
But we haven’t asked the question behind the biases: why are we born that way? What’s the point of thinking fallaciously? From an evolutionary perspective, why haven’t these biases been bred out of us? After all, what’s the benefit of being born with, say, the confirmation bias?
Elizabeth Kolbert has just published an interesting article in The New Yorker that helps answer some of these questions. (Click here). The article reviews three new books about how we think.
Kolbert writes that the basic idea that ties these books together is sociability as opposed to logic. Our brains didn’t evolve to be logical. They evolved to help us be more sociable. Here’s how Kolbert explains it:
“Humans’ biggest advantage over other species is our ability to coöperate. Coöperation is difficult to establish and almost as difficult to sustain. For any individual, freeloading is always the best course of action. Reason developed not to enable us to solve abstract, logical problems or even to help us draw conclusions from unfamiliar data; rather, it developed to resolve the problems posed by living in collaborative groups.”
So, the confirmation bias, for instance, doesn’t help us make good, logical decisions but it does help us cooperate with others. If you say something that confirms what I already believe, I’ll accept your wisdom and think more highly of you. This helps us confirm our alliance to each other and unifies our group. I know I can trust you because you see the world the same way I do.
If, on the other hand, someone in another group says something that disconfirms my belief, I know that she doesn’t agree with me. She doesn’t see the world the same way I do. I don’t see this as a logical challenge but as a social challenge. I doubt that I can work effectively with her. Rather than checking my facts, I check her off my list of trusted cooperators. An us-versus-them dynamic develops, which solidifies cooperation in my group.
Mercier and Sperber, in fact, change the name of the confirmation bias to the “myside bias”. I cooperate with my side. I don’t cooperate with people who don’t confirm my side.
Why hasn’t the confirmation/myside bias gone away? Kolbert quotes Mercier and Sperber: “This is one of many cases in which the environment changed too quickly for natural selection to catch up.” All we have to do is wait 1,000 generations or so. Or maybe we can program artificial intelligence to solve the problem.
When I worked for business-to-business software vendors, I often met companies that were simply out of date. They hadn’t caught up with the latest trends and buzzwords. They used inefficient processes and outdated business practices.
Why were they so far behind? Because that’s the way their software worked. They had loaded an early version of a software system (perhaps from my company) and never upgraded it. The system became comfortable. It was the way they had always done it. If it ain’t broke, don’t fix it.
I’ve often wondered if we humans don’t do the same thing. Perhaps we load the software called Human 1.0 during childhood and then just forget about it. It works. It gets us through the day. It’s comfortable. Don’t mess with success.
Fixing the problem for companies was easy: just buy my new software. But how do we solve the problem (if it is a problem) for humans? How do we load Human 2.0? What patches do we need? What new processes do we need to learn? What new practices do we need to adopt?
As a teacher of critical thinking, I’d like to think that critical thinking is one element of such an upgrade. When we learn most skills – ice skating, piano playing, cooking, driving, etc. – we seek out a teacher to help us master the craft. We use a teacher – and perhaps a coach – to help us upgrade our skills to a new level.
But not so with thinking. We think we know how to think; we’ve been doing it all our lives. We don’t realize that thinking is a skill like any other. If we want to get better at basketball, we practice. If we want to get better at thinking … well, we don’t really want to get better at thinking, do we? We assume that we’re good enough. If the only thinking we know is the thinking that we do, then we don’t see the need to change our thinking.
So how do we help people realize that they can upgrade their thinking? Focusing on fallacies often works. I often start my classes by asking students to think through the way we make mistakes. For instance, we often use shortcuts – more formally known as heuristics – to reach decisions quickly. Most of the time they work – we make good decisions and save time in the process. But when they don’t work, we make very predictable errors. We invade the wrong country, marry the wrong person, or take the wrong job.
When we make big mistakes, we can draw one of two conclusions. On the one hand, we might conclude that we made a mistake and need to rethink our thinking. On the other hand, we might conclude that our thinking was just fine but that our political opponents undermined our noble efforts. If not for them, everything would be peachy. The second conclusion is lazy and popular. We’re not responsible for the mess – someone else is.
But let’s focus for a moment on the first conclusion – we realize that we need to upgrade our thinking. Then what? Well… I suppose that everyone could sign up for my critical thinking class. But what if that’s not enough? As people realize that there are better ways to think, they’ll ask for coaches, and teachers, and gurus.
If you’re an entrepreneur, there’s an opportunity here. I expect that many companies and non-profit organizations will emerge to promote the need and service the demand. The first one I’ve spotted is the Center for Applied Rationality (CFAR). Based in Berkeley (of course), CFAR’s motto is “Turning Cognitive Science Into Cognitive Practice”. I’ve browsed through their web site and read a very interesting article in the New York Times (click here). CFAR seems to touch on many of the same concepts that I use in my critical thinking class – but they do it on a much grander scale.
If I’m right, CFAR is at the leading edge of an interesting new wave. I expect to see many more organizations pop up to promote rationality, cognitive enhancements, behavioral economics, or … to us traditional practitioners, critical thinking. Get ready. Critical thinking is about to be industrialized. Time to put your critical thinking cap on.
How much does your body affect your brain? A lot more than we might have guessed even just a few years ago. The general concept – known as embodied cognition – holds that the body and the brain are one system, not two. (Sorry, Descartes). What the body is doing affects what the brain is thinking.
I’ve written about embodied cognition before (here and here). Recently, I’ve seen a spate of new stories that extend our understanding. Here’s a summary:
The power pose – want to perform better in an upcoming job interview? Just before the interview, strike a power pose for two minutes. Your testosterone will go up and your cortisol will go down. You’ll be more confident and assertive and knock ’em dead in the interview. Amy Cuddy explains it all in the second most-watched TED video ever.
Willpower, dissension, and glucose – If you run ten miles, you’ll deplete your energy reserves. You may need to relax and refuel before taking up a new physical challenge. Does the same thing happen with willpower? Apparently so. If you resist the temptation to smoke a cigarette, you’ll have less willpower left to resist eating a donut. You can use up willpower just like you use up physical power. Perhaps that’s why you’re more likely to argue with your spouse when your glucose levels are low. If you sip a glass of lemonade, you might just avoid the argument altogether.
Musicians have better memories – experiments at the University of Texas suggest that professional musicians have better short- and long-term memories than the rest of us. For short-term memory (working memory), the musicians are better at both verbal and pictorial recall. For long-term memory, they’re better at pictorial recall. Maybe we should invest more in musical education.
How you walk affects your mood – as Scientific American points out, “A good mood may put a spring in your step. But the opposite can work too: purposefully putting a spring in your step can improve your mood.” Science Daily reports the converse as well: if you walk with slumped shoulders and head down, you’ll eventually get grumpy. Your Mom was right: standing up straight actually does affect your mood and performance.
Intuition may just be your body talking to you – when you get nervous, your palms may start to sweat. Your mood is affecting your body, right? Well, maybe it’s the other way round. Your intuition (also known as System 1) senses that something is amiss. It needs to get your (System 2) attention somehow. What’s the best way? How about sweaty palms and a racing heartbeat? They’re simple, effective signaling techniques that are hard to ignore.
The power of a pencil – want to get happy? Hold a pencil in your mouth like the woman in the picture. Your facial muscles act as if they’re smiling. You may consciously realize that you’re not smiling but it doesn’t really matter – your body is doing the thinking for you.
Remember heuristics? They’re the rules of thumb that allow us to make snap judgments, using System 1, our fast, automatic, always-on thinking system. They can also lead us into errors. According to psychologists, there are at least 17 errors that we commonly make. In previous articles, I’ve written about seven of them (click here and here). Let’s look at four more today.
Association — word association games are a lot of fun. (Actually, words are a lot of fun). But making associations and then drawing conclusions from them can get you into trouble. You say tomato and I think of the time I ate a tomato salad and got sick. I’m not going to do that again. That’s not good hygiene or good logic. The upside is that word associations can lead you to some creative thinking. You can make connections that you might otherwise have missed. And, as we all know, connections are the foundation of innovation. Just be careful about drawing conclusions.
Power differential — did you ever work for a boss with a powerful personality? Then you know something about this heuristic. Socially and politically, it may be easier to accept an argument made by a “superior authority” than it is to oppose it. It’s natural. We tend to defer to those who have more power or prestige than we do. Indeed, there’s an upside here as well. It’s called group harmony. Sometimes you do need to accept your spouse’s preferences even if they differ from yours. The trick is to recognize when preferences are merely a matter of taste versus preferences that can have significant negative results. As Thomas Jefferson said, “On matters of style, swim with the current. On matters of principle, stand like a rock”.
Illusion of control — how much control do you really have over processes and people at your office? It’s probably a lot less than you think. I’ve worked with executives who think they’ve solved a problem just because they’ve given one good speech. A good speech can help but it’s usually just one step in a long chain of activities. Here’s a tip for spotting other people who have an illusion of control. They say “I” much more often than “we.” It’s poor communication and one of the worst mistakes you can make in a job interview. (Click here for more).
Loss and risk aversion — let’s just keep doing what we’re doing. Let’s not change things … we might be worse off. Why take risks? It happens that risk aversion has a much bigger influence on economic decisions than we once thought. In Thinking, Fast and Slow, Daniel Kahneman writes about our unbalanced logic when considering gain versus loss — we fear loss more than we’re attracted by gain. In general terms, the pain of a loss is about double the pleasure of a gain. So, emotionally, it takes a $200 gain to balance a $100 loss. Making 2-to-1 decisions may be good for your nerves but it often means that you’ll pass up good economic opportunities.
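Kahneman’s 2-to-1 asymmetry can be sketched as a toy value function in the spirit of prospect theory. This is a simplified illustration, not Kahneman’s actual model: the coefficient of 2.0 and the function names are my own, chosen only to mirror the $200-versus-$100 arithmetic above.

```python
# Toy value function: losses weigh about twice as heavily as
# equivalent gains (loss-aversion coefficient of roughly 2).
LOSS_AVERSION = 2.0  # illustrative, not a fitted value


def felt_value(dollars: float) -> float:
    """Subjective value of a gain (positive) or loss (negative) in dollars."""
    if dollars >= 0:
        return dollars
    return LOSS_AVERSION * dollars  # losses hurt roughly twice as much


# A coin flip: win $200 or lose $100. In dollars, the expected value
# is +$50. In *felt* value it is 0.5*200 + 0.5*(-200) = 0 -- emotionally
# a wash, which is why many people decline a bet they "should" take.
expected_feeling = 0.5 * felt_value(200) + 0.5 * felt_value(-100)
print(expected_feeling)  # 0.0
```

In other words, the gamble only starts to feel attractive once the potential gain is about double the potential loss.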
To prepare this article, I drew primarily on Peter Facione’s Think Critically. (Click here) Daniel Kahneman’s book is here.
Do generals commit adultery more often than, say, elementary school teachers?
The way we answer this question says a lot about the way we think. If you’ve been reading about American generals recently, you know that a lot of top-ranking officers have been caught with their hands in the cookie jar. The facts are easily available to you. You can recall them quickly. Indeed, they’re very likely top of mind. (One of my students asked, in mock horror, since when have generals taken orders from their privates?)
On the other hand, when was the last time you read about cheating primary school teachers? It’s probably been a long time, if ever. Why? Because stories about cheating teachers don’t sell many newspapers. Stories about cheating generals seize our attention and hold it. It’s a great way to sell newspapers, magazines, and TV shows.
So, it’s easy for you to remember stories about cheating generals. It’s much harder to remember stories about cheating teachers. Based on your ability to remember relevant cases, you might conclude that generals do indeed stray more often than teachers. Would you be right? Maybe … but maybe not. All you’ve really done is search your own memory banks. As we all know, memory is fallible and can easily play tricks on us.
When we’re asked a comparative question like generals versus teachers, we often try to answer a different question: how many cases of each can I readily recall? It’s an easier question to answer and doesn’t require us to search external sources and think hard thoughts. Though it’s easy, it’s often erroneous.
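The substitution is easy to simulate. Suppose, purely for illustration, that generals and teachers stray at exactly the same rate, but the press reports the generals’ cases far more often. Estimating from “stories I can recall” then wildly inflates the generals’ rate. All the numbers below are made up; only the mechanism matters.

```python
import random

random.seed(42)

TRUE_RATE = 0.03  # assume both groups actually stray at the same 3% rate
REPORT_PROB = {"general": 0.9, "teacher": 0.01}  # press coverage differs wildly


def recalled_cases(group: str, population: int = 10_000) -> int:
    """Count the cases you'd remember: a case must exist AND make the news."""
    cases = sum(random.random() < TRUE_RATE for _ in range(population))
    return sum(random.random() < REPORT_PROB[group] for _ in range(cases))


print(recalled_cases("general"))  # hundreds of memorable stories
print(recalled_cases("teacher"))  # a handful, despite the identical true rate
```

Judging by recall alone, the generals look far worse, even though the underlying rates are identical by construction. That gap is the availability heuristic at work.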
I think I saw this phenomenon in action during the recent presidential election. My friends who supported Obama tended to talk to other people who supported Obama. If you asked how many people would support Obama, they could readily retrieve many cases and conclude that Obama would win. Of course, my friends who supported Romney were doing exactly the same thing — talking with or listening to other Romney supporters. I heard one person say, “Of course Romney will win. Everybody hates Obama”. I suspect that everybody he talked to hated Obama. But that’s not the same as everybody.
Relying on easily available information can help create the political chasms that we see around us. If you read a lot of articles about intransigent Republicans, you may conclude that Republicans are more intransigent than Democrats. That may be true … or it could just be a product of what you remember. Similarly, if you read lots of articles about Democrats undercutting the military, you might come to believe … well, you get the picture.
What should we do? First, remember that the easy answer is often the wrong answer. It depends on what we remember rather than what’s actually happening. Second, start reading more sources that “disagree” with your point of view. All information sources have some degree of bias. Reading widely can help you establish a balance. Third, study up on statistics. It will help you understand what’s accurate and what’s not.
By the way, this post is adapted from Thinking, Fast and Slow by Daniel Kahneman, easily the best book I’ve read this year. You can find it here.
(Note: I’ll teach a class on Applied Critical Thinking during the winter term at the University of Denver. Some of my teaching material will show up here in posts about how we think. They’ll all carry the tag, Applied Critical Thinking, so you can find them easily).