When I worked for business-to-business software vendors, I often met companies that were simply out of date. They hadn’t caught up with the latest trends and buzzwords. They used inefficient processes and outdated business practices.
Why were they so far behind? Because that’s the way their software worked. They had loaded an early version of a software system (perhaps from my company) and never upgraded it. The system became comfortable. It was the way they had always done things. If it ain’t broke, don’t fix it.
I’ve often wondered if we humans don’t do the same thing. Perhaps we load the software called Human 1.0 during childhood and then just forget about it. It works. It gets us through the day. It’s comfortable. Don’t mess with success.
Fixing the problem for companies was easy: just buy my new software. But how do we solve the problem (if it is a problem) for humans? How do we load Human 2.0? What patches do we need? What new processes do we need to learn? What new practices do we need to adopt?
As a teacher of critical thinking, I’d like to think that critical thinking is one element of such an upgrade. When we learn most skills – ice skating, piano playing, cooking, driving, etc. – we seek out a teacher to help us master the craft. We use a teacher – and perhaps a coach – to help us upgrade our skills to a new level.
But not so with thinking. We think we know how to think; we’ve been doing it all our lives. We don’t realize that thinking is a skill like any other. If we want to get better at basketball, we practice. If we want to get better at thinking … well, we don’t really want to get better at thinking, do we? We assume that we’re good enough. If the only thinking we know is the thinking that we do, then we don’t see the need to change our thinking.
So how do we help people realize that they can upgrade their thinking? Focusing on fallacies often works. I often start my classes by asking students to think through the way we make mistakes. For instance, we often use shortcuts – more formally known as heuristics – to reach decisions quickly. Most of the time they work – we make good decisions and save time in the process. But when they don’t work, we make very predictable errors. We invade the wrong country, marry the wrong person, or take the wrong job.
When we make big mistakes, we can draw one of two conclusions. On the one hand, we might conclude that we made a mistake and need to rethink our thinking. On the other hand, we might conclude that our thinking was just fine but that our political opponents undermined our noble efforts. If not for them, everything would be peachy. The second conclusion is lazy and popular. We’re not responsible for the mess – someone else is.
But let’s focus for a moment on the first conclusion – we realize that we need to upgrade our thinking. Then what? Well… I suppose that everyone could sign up for my critical thinking class. But what if that’s not enough? As people realize that there are better ways to think, they’ll ask for coaches, and teachers, and gurus.
If you’re an entrepreneur, there’s an opportunity here. I expect that many companies and non-profit organizations will emerge to promote the need and service the demand. The first one I’ve spotted is the Center for Applied Rationality (CFAR). Based in Berkeley (of course), CFAR’s motto is “Turning Cognitive Science Into Cognitive Practice”. I’ve browsed through their website and read a very interesting article in the New York Times. CFAR seems to touch on many of the same concepts that I use in my critical thinking class – but they do it on a much grander scale.
If I’m right, CFAR is at the leading edge of an interesting new wave. I expect to see many more organizations pop up to promote rationality, cognitive enhancements, behavioral economics, or … to us traditional practitioners, critical thinking. Get ready. Critical thinking is about to be industrialized. Time to put your critical thinking cap on.
I speak Spanish reasonably well but I find it very tiring … which suggests that I probably think more clearly and ethically in Spanish than in English.
Like so many things, it’s all related to our two different modes of thinking: System 1 and System 2. System 1 is fast and efficient and operates below the level of consciousness. It makes the great majority of our decisions, typically without any input from our conscious selves. We literally make decisions without knowing that we’re making decisions.
System 2 is all about conscious thought. We bring information into System 2, think it through, and make reasoned decisions. System 2 uses a lot of calories; it’s hard work. As Daniel Kahneman says, “Thinking is to humans as swimming is to cats; they can do it but they’d prefer not to.”
English, of course, is my native language. (American English, that is). It’s second nature to me. It’s easy and fluid. I can think in English without thinking about it. In other words, English is the language of my System 1. At this point in my life, it’s the only language in my System 1 and will probably remain so.
To speak Spanish, on the other hand, I have to invoke System 2. I have to think about my word choice, pronunciation, phrasing, and so on. It’s hard work and wears me out. I can do it but I would have to live in Spain for a while for it to become easy and fluid. (That’s not such a bad idea, is it?)
You may remember that System 1 makes decisions using heuristics or simple rules of thumb. System 1 simplifies everything and makes snap judgments. Most of the time, those judgments are pretty good but, when they’re wrong, they’re wrong in consistent ways. System 1, in other words, is the source of biases that we all have.
To overcome these biases, we have to bring the decision into System 2 and consider it rationally. That takes time, effort, and energy and, oftentimes, we don’t do it. It’s easy to conclude that someone is a jerk. It’s more difficult to invoke System 2 to imagine what that person’s life is like.
So how does language affect all this? I can only speak Spanish in my rational, logical, conscious System 2. When I’m thinking in Spanish, all my rational neurons are firing. I tend to think more carefully, more thoughtfully, and more ethically. It’s tiring.
When I think in English, on the other hand, I could invoke my System 2 but I certainly don’t have to. I can easily use heuristics in English but not in Spanish. I can jump to conclusions in English but not in Spanish.
The seminal article on this topic was published in 2012 by three professors from the University of Chicago. They write, “Would you make the same decisions in a foreign language as you would in your native tongue? It may be intuitive that people would make the same choices regardless of the language they are using…. We discovered, however, that the opposite is true: Using a foreign language reduces decision-making biases.”
So, it’s true: I’m a better person in Spanish.
In my critical thinking classes, students get a good dose of heuristics and biases and how they affect the quality of our decisions. Daniel Kahneman and Amos Tversky popularized the notion that we should look at how people actually make decisions as opposed to how they should make decisions if they were perfectly rational.
Most of our decision-making heuristics (or rules of thumb) work most of the time but when they go wrong, they do so in predictable and consistent ways. For instance, we’re not naturally good at judging risk. We tend to overestimate the risk of vividly scary events and underestimate the risk of humdrum, everyday problems. If we’re aware of these biases, we can account for them in our thinking and, perhaps, correct them.
Finding that our economic decisions are often irrational rather than rational has created a whole new field, generally known as behavioral economics. The field ties together concepts as diverse as the availability bias, the endowment effect, the confirmation bias, overconfidence, and hedonic adaptation to explain how people actually make decisions. Though it’s called economics, the basis is psychology.
So does this mean that traditional, rational, statistical, academic decision-making is dead? Well, not so fast. According to Justin Fox’s article in a recent issue of Harvard Business Review, there are at least three philosophies of decision-making, and each has its place.
Fox acknowledges that, “The Kahneman-Tversky heuristics-and-biases approach has the upper hand right now, both in academia and in the public mind.” But that doesn’t mean that it’s the only game in town.
The traditional, rational, tree-structured logic of formal decision analysis – developed by Ronald Howard, Howard Raiffa, and Ward Edwards – hasn’t gone away. Fox argues that the classic approach is best suited to making “Big decisions with long investment horizons and reliable data [as in] oil, gas, and pharma.” Fox notes that Chevron is a major practitioner of the art and that Nate Silver, famous for accurately predicting the elections of 2012, used a Bayesian variant of the basic approach.
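To make the “Bayesian variant” idea concrete, here’s a minimal sketch of a single Bayesian update. All the numbers are hypothetical: a 50% prior that a candidate wins, and a poll that favors that candidate 70% of the time when they truly go on to win, but 30% of the time when they don’t.

```python
def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Return the posterior P(H | evidence) via Bayes' rule."""
    numerator = p_evidence_given_h * prior
    denominator = numerator + p_evidence_given_not_h * (1 - prior)
    return numerator / denominator

# Hypothetical figures: 0.5 prior, poll favors the candidate with
# probability 0.7 if they win and 0.3 if they lose.
posterior = bayes_update(0.5, 0.7, 0.3)
print(posterior)  # 0.7 -- the favorable poll raises our belief from 50% to 70%
```

Each new poll simply becomes the evidence for the next update, with the previous posterior serving as the new prior – which is what gives the approach its appeal for forecasting with “long investment horizons and reliable data.”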
And what about non-rational heuristics that actually do work well? Let’s say, for instance, that you want to rationally allocate your retirement savings across N different investment options. Investing evenly in each of the N funds is typically just as good as any other approach. Known as the 1/N approach, it’s a simple heuristic that leads to good results. Similarly, in choosing between two options, selecting the one you’re more familiar with usually creates results that are no worse than any other approach – and does so more quickly and at much lower cost.
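The 1/N heuristic is simple enough to sketch in a few lines. The fund names and dollar amount below are made up for illustration:

```python
def one_over_n_allocation(funds, total):
    """Split a total investment evenly across N funds (the 1/N heuristic)."""
    share = total / len(funds)
    return {fund: share for fund in funds}

# Hypothetical portfolio: $9,000 spread across three funds.
allocation = one_over_n_allocation(["stocks", "bonds", "international"], 9000.0)
print(allocation)  # each fund receives 3000.0
```

The appeal is exactly what the paragraph above describes: no estimation, no optimization, and performance that is typically no worse than far more elaborate allocation schemes.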
Fox calls this the “effective heuristics” approach or, more simply, the gut-feel approach. Fox suggests that this is most effective, “In predictable situations with opportunities for learning, [such as] firefighting, flying, and sports.” When you have plenty of practice in a predictable situation, your intuition can serve you well. In fact, I’d suggest that the famous (or infamous) interception at the goal line in this year’s Super Bowl resulted from exactly this kind of thinking.
And where does the heuristics-and-biases model fit best? According to Fox, it helps us to “Design better institutions, warn ourselves away from dumb mistakes, and better understand the priorities of others.”
So, we have three philosophies of decision-making and each has its place in the sun. I like the heuristics-and-biases approach because I like to understand how people actually behave. Having read Fox, though, I’ll be sure to add more on the other two philosophies in upcoming classes.
We went to the airport the other day and realized that we were out of cash. I stopped at an ATM, pulled out a credit card, and froze. I rarely use that particular card at ATMs and I had completely forgotten the personal identification number. I stared blankly at the ATM screen for a few minutes and then slowly started to walk away.
A few seconds later, the number popped into my head: 2061. I’m used to having things pop into my head as I “give up” on a problem. When I focus on a problem, I block out information. As I start to unfocus, useful information pops back into my head. I find that I’m much less creative when I’m intently focused. (Recently, for instance, Steven Wright popped into my head.)
Pleased that my mind was working so effectively, I returned to the ATM, inserted my card, and entered the digits 2061. Wrong. Hmmm … perhaps I transposed some digits. I tried various combinations: 2601, 2106, 1206, and so on. Nothing worked.
So again, I walked slowly away from the terminal. As I did, I noticed that I was standing next to an airport conference room. The number on the door: 2061. My System 1 had picked up the number subconsciously. It wasn’t a useful data point so System 1 didn’t register it with System 2. Then my System 2 broadcast a message: “We’re looking for a four-digit number.” At that point, System 1 produced the most recent four-digit number it was aware of: 2061.
Unfortunately, it was the wrong number. But I was convinced it was the right number. It popped into my head just the way I expected it to.
Was my mind playing tricks on me? Not really. In David Brooks’ phrase, my “millions of little scouts” were out surveying my environment. One scout sent back some information that might be useful, the number 2061. The little scout was trying to help. Unfortunately, he led me astray. System 1 is usually right. But when it’s wrong, it can get you into big trouble. Like getting your credit card cancelled.
I’m a pretty good driver. How do I know? I can observe other drivers and compare their skills to mine. I see them making silly mistakes. I (usually) avoid those mistakes myself. QED: I must be a better-than-average driver. I’d like to stay that way and that motivates me to practice my driving skills.
Using observation and comparison, I can also conclude that I’m not a very good basketball player. I can observe what other players do and compare their skills to mine. They’re better than I am. That may give me the motivation to practice my hoops skills.
Using observation and comparison I can conclude that I’m better at driving the highway than at driving the lane. But how do I know if I’m a good thinker or not? I can’t observe other people thinking. Indeed, according to many neuroscientists, I can’t even observe myself thinking. System 1 thinking happens below the level of conscious awareness. So I can’t observe and compare.
Perhaps I could compare the results of thinking rather than thinking itself. People who are good thinkers should be more successful than those who aren’t, right? Well, maybe not. People might be successful because they’re lucky or charismatic, or because they were born to the right parents in the right place. I’m sure that we can all think of successful people who aren’t very good thinkers.
So, how do we know if we’re good thinkers or not? Well, most often we don’t. And, because we can’t observe and compare, we may not have the motivation to improve our thinking skills. Indeed, we may not realize that we can improve our thinking.
I see this among the students in my critical thinking class. Students will have varying opinions about their own thinking skills. But most of them have not thought about their thinking and how to improve it.
Some of my students seem to think they’re below average thinkers. In their papers, they write about the mistakes they’ve made and how they berate themselves for poor thinking. They can’t observe other people making the same mistake so they assume that they’re the only ones. Actually, the mistakes seem fairly commonplace to me and I write a lot of comments along these lines: “Don’t beat yourself up over this. Everybody makes this mistake.”
Some of my students, of course, think they’re above average thinkers. Some (though not many) think they’re about average. But I think the single largest group – maybe not a majority but certainly a plurality – think they’re below average.
I realized recently that the course aims to build student confidence (and motivation) by making thinking visible. When we can see how people think, then we can observe and compare. So we look at thinking processes and catalog the common mistakes people make. As we discuss these patterns, I often hear students say, “Oh, I thought I was the only one to do that.”
In general, students get the hang of it pretty quickly. Once they can observe external patterns and processes, they’re very perceptive about their own thinking. Once they can make comparisons, they seem highly motivated to practice the arts of critical thinking. It’s like driving or basketball – all it takes is practice.