In my critical thinking class, we investigate a couple of dozen cognitive biases — fallacies in the way our brains process information and reach decisions. These include the confirmation bias, the availability bias, the survivorship bias, and many more. I call these factory-installed biases – we’re born this way.
But we haven’t asked the question behind the biases: why are we born that way? What’s the point of thinking fallaciously? From an evolutionary perspective, why haven’t these biases been bred out of us? After all, what’s the benefit of being born with, say, the confirmation bias?
Elizabeth Kolbert has just published an interesting article in The New Yorker that helps answer some of these questions. (Click here). The article reviews three new books about how we think.
Kolbert writes that the basic idea that ties these books together is sociability as opposed to logic. Our brains didn’t evolve to be logical. They evolved to help us be more sociable. Here’s how Kolbert explains it:
“Humans’ biggest advantage over other species is our ability to coöperate. Coöperation is difficult to establish and almost as difficult to sustain. For any individual, freeloading is always the best course of action. Reason developed not to enable us to solve abstract, logical problems or even to help us draw conclusions from unfamiliar data; rather, it developed to resolve the problems posed by living in collaborative groups.”
So the confirmation bias, for instance, doesn't help us make good, logical decisions, but it does help us cooperate with others. If you say something that confirms what I already believe, I'll accept your wisdom and think more highly of you. This helps confirm our alliance with each other and unifies our group. I know I can trust you because you see the world the same way I do.
If, on the other hand, someone in another group says something that disconfirms my belief, I know that she doesn't agree with me. She doesn't see the world the same way I do. I don't see this as a logical challenge but as a social challenge. I doubt that I can work effectively with her. Rather than checking my facts, I check her off my list of trusted cooperators. An us-versus-them dynamic develops, which solidifies cooperation in my group.
Mercier and Sperber, in fact, change the name of the confirmation bias to the “myside bias”. I cooperate with my side. I don’t cooperate with people who don’t confirm my side.
Why wouldn’t the confirmation/myside bias have gone away? Kolbert quotes Mercier and Sperber: “This is one of many cases in which the environment changed too quickly for natural selection to catch up.” All we have to do is wait 1,000 generations or so. Or maybe we can program artificial intelligence to solve the problem.
I just spotted this article on Inc. magazine’s website:
The article’s subhead is: “America’s 25 most admired CEOs have earned the respect of their people. Here’s how you can too.”
Does this sound familiar? It’s a good example of the survivorship fallacy. (See also here and here). The 25 CEOs selected for the article “survived” a selection process. The author then highlights the common behaviors among the 25 leaders. The implication is that — if you behave the same way — you too will become a revered leader.
Is it true? Well, think about the hundreds of CEOs who didn’t survive the selection process. I suspect that many of the unselected CEOs behave in ways similar to the 25 selectees. But the unselected CEOs didn’t become revered leaders. Why not? Hard to say, precisely because we’re not studying them. It’s not at all clear to me that I will become a revered leader if I behave like the 25 selectees. In fact, the reverse may be true: people may think that I’m being inauthentic and lose respect for me.
A better research method would be to select 25 leaders who are “revered” and compare them to 25 leaders who are not “revered”. (Defining what “revered” means will be slippery). By selecting two groups, we have some basis for comparison and contrast. This can often lead to deeper insights.
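To see why the comparison group matters, here’s a quick toy simulation (all numbers are invented for illustration): 1,000 hypothetical CEOs, a common habit that barely affects performance, and a “most admired” list built from the top 25 performers.

```python
import random

random.seed(42)

# Toy model (numbers invented for illustration): 1,000 CEOs.
# Each either practices an admired habit (say, daily one-on-ones) or not.
# The habit adds only a sliver to performance, which is mostly luck.
ceos = []
for _ in range(1000):
    habit = random.random() < 0.6          # 60% of all CEOs have the habit
    performance = random.gauss(0, 1) + (0.1 if habit else 0.0)
    ceos.append((performance, habit))

# "Survivors": the 25 top performers, as a magazine list might select them.
ceos.sort(reverse=True)
top25 = ceos[:25]
rest = ceos[25:]

rate_top = sum(h for _, h in top25) / len(top25)
rate_rest = sum(h for _, h in rest) / len(rest)
print(f"habit rate among the top 25: {rate_top:.0%}")
print(f"habit rate among the rest:   {rate_rest:.0%}")
```

If we studied only the top 25, we’d find that most of them share the habit and might conclude it’s the secret to success. The comparison group reveals that the unselected CEOs share it at nearly the same rate.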
As it stands, the Inc. article reminds me of the book for teenagers called How To Be Popular. It’s cute but not very meaningful.
In last year’s NCAA football championship game, Alabama beat Clemson by a score of 45 to 40.
In this year’s NCAA football championship game, Clemson beat Alabama by a score of 35 to 31.
The aggregate score is 76 to 75 in favor of Alabama.
So, which team is more skilled?
To ponder the question, we need to return to Michael Mauboussin’s ideas* about skill and luck – and, especially, his concept of the paradox of skill.
Let’s start with definitions for skill and luck. For Mauboussin, a key question helps us identify skill: Can I lose on purpose? If the answer is yes, then some skill must be involved in the process, whether you’re shooting hoops or playing poker. If the answer is no, then the process is random – it’s a matter of luck.
Most processes – like NCAA football games – involve both skill and luck. How can we sort out the differences between the two? Was Alabama more skilled last year or just luckier? What about Clemson this year?
Mauboussin’s paradox of skill can help us sort this out. Simply put, the paradox states that: “In activities that involve some luck, the improvement of skill makes luck more important…” We have training programs that can improve skills in many competitive activities, including sports, business performance, combat, and perhaps, even investing. As more people take advantage of these programs and average skill levels improve, you might think that luck would become less important in determining outcomes.
Mauboussin says that exactly the opposite is true. The big issue is skill differential and distribution. If a given skill is unevenly distributed in a society, then skill likely determines the outcome. Luck doesn’t have a chance to worm its way in. On the other hand, if skill is broadly and evenly distributed, then even minor fluctuations in luck can change the outcome.
As an example, Mauboussin cites the difference between the winning time and the time of the 20th finisher in the men’s Olympic marathon. In 1932, the difference was 39 minutes. In 2012, it was 7.5 minutes. Clearly, the skill of marathon running has become more evenly distributed over the past 80 years: we now have more runners with greater skill, and that skill is spread more evenly than it was in the past. As a result, the marathon has become much more competitive.
Paradoxically, as the marathon has become more competitive, luck plays a greater role. Let’s say that the 1932 winner had the bad luck of stepping in a pothole at Mile 22 and had to limp to the finish line. Because he had so much more skill than the other runners, he might still have won the race. If the 2012 winner stepped in the same pothole, chances are the other (highly skilled) runners would have caught and passed him. He would have lost because of bad luck.
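Mauboussin’s paradox is easy to see in a toy simulation (the numbers here are invented for illustration): model each competitor’s result as skill plus normally distributed luck, and watch how often the less-skilled side wins as the skill gap shrinks.

```python
import random

random.seed(0)

def upset_rate(skill_gap, luck_sd, trials=10_000):
    """How often the less-skilled side wins when
    outcome = skill + normally distributed luck."""
    upsets = 0
    for _ in range(trials):
        strong = skill_gap + random.gauss(0, luck_sd)
        weak = 0.0 + random.gauss(0, luck_sd)
        if weak > strong:
            upsets += 1
    return upsets / trials

# Luck is identical in both eras; only the skill gap shrinks.
print(f"wide skill gap (1932-style): {upset_rate(skill_gap=3.0, luck_sd=1.0):.1%}")
print(f"tiny skill gap (2012-style): {upset_rate(skill_gap=0.2, luck_sd=1.0):.1%}")
```

With a wide skill gap, upsets are rare; when skills converge, the underdog wins nearly half the time, even though luck itself hasn’t changed at all.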
The paradox of skill should teach us some humility; it also helps illuminate the illusion of control. We may think we’re successful because we’re skilled and talented and can control the events around us. But oftentimes – especially when skill is evenly distributed – it’s nothing more than an illusion. It’s just plain luck.
And what about Clemson and Alabama? My interpretation is that both teams are perfectly balanced in terms of skills. So the outcome depends almost entirely on luck: a lucky bounce, a stray breeze, a bad call, a slippery turf, and so on. Let’s celebrate two great teams that have separated themselves from the pack but not from each other. Perhaps we should call them Clembama.
* I used several sources for Mauboussin’s ideas. His 2012 book, The Success Equation, is here. In 2012, he also gave a very succinct presentation to the CFA Institute. That paper is here. His HBR article from 2011 is here. In 2014, he gave a lecture as part of the Authors at Google series – you can find the video here. And David Hurst’s very enlightening review of Mauboussin’s book is here.
I’ve written at various times about embodied cognition – the idea that the body influences the mind. (See here, here, and here.) In other words, our mind is not limited to our brain. We think with our bodies as well. You can improve your confidence by making yourself big. You can brighten your mood by putting a smile on your face. Want to feel morally pure? Take a bath.
How far does this extend? The clothes you wear, for instance, touch your body and mediate between your body and the world around you. It’s fair to ask: do the clothes you wear influence your thinking?
The answer is yes. Hajo Adam and Adam Galinsky introduced the term “enclothed cognition” in an article in the Journal of Experimental Social Psychology in July 2012. (Click here). They write that enclothed cognition describes, “…the systematic influence that clothes have on the wearer’s psychological processes.” They also suggest that two factors come into play: “the symbolic meaning of the clothes and the physical experience of wearing them.”
Many clothes have symbolic value. Take the humble white coat. In a hospital setting, we might assume that someone wearing a white coat is an expert or an authority. We behave differently towards her because of the coat’s symbolism. In other words, the coat affects the perceiver’s cognition and behavior. But does it affect the wearer’s cognition?
Adam and Galinsky conducted three experiments to find out. In the first, they randomly divided participants into two groups, one of which wore white lab coats, the other of which did not. Both groups then took the Stroop test, in which color words are printed in mismatched ink (the word “blue” in red ink, for instance) and participants must name the ink color while ignoring the word itself. The group wearing white lab coats made about half as many errors as the other group.
The second test used three groups. One group wore a white lab coat and believed that it was a doctor’s coat. The second group wore an identical white lab coat but believed that it was painter’s coat. The third group wore normal street clothes. The experimenters asked the three groups to spot discrepancies in a series of illustrations. Those who wore the doctor’s coat found more discrepancies than either of the other two groups. The symbolic value of a doctor’s coat had greater impact on attention than did the painter’s coat.
The third experiment was similar to the second except that some groups didn’t wear the doctor’s or painter’s coat; they merely observed them. Those who donned the doctor’s coat performed best.
The study suggests that the symbolic nature of clothing does indeed affect our cognition. Merely observing the clothes does not trigger the effect (or does so only mildly). Actually wearing the clothes has a meaningful impact on our thinking and behavior.
These studies suggest that our clothes not only affect how others perceive us. They also affect how we perceive ourselves. Even if no one sees us, our clothes influence our cognition. Perhaps, then, we can dress for success, even if we work alone. Similarly, wearing athletic clothes may well improve our chances of getting a good workout. Dressing like a member of the clergy may make us behave more ethically. Dressing like a slob may make us behave like a slob.
There’s one other wrinkle that was brought to my attention – oddly enough – by my spellchecker. When I wrote “enclothed cognition”, the spellchecker consistently converted it to “unclothed cognition”. This raises an interesting question. If clothes affect our cognition in certain ways, does the absence of clothes affect our cognition in other ways? Time for another study.
When I went off to college, my mother told me, “Now remember … you’re going to college to learn how to think. Don’t miss that lesson.”
I wonder what she would say if she were sending me off to college today. It might be more along the lines of, “Now remember … you’re going to college to get a good job. Don’t blow it.”
We can only judge programs and processes based on their goals. If the goal of government is to provide good services at a reasonable cost, we might give it a fairly low grade. However, if the goal of government is to increase employment, then we might evaluate it more positively. The same is true of higher education. So what is the goal of higher education? Is it to teach students how to think? Or is it to provide them skills to get a job?
I would argue that the goal of higher education – indeed of any education – is to improve the students’ ability to think. Good thinking can certainly help you get a job. In fact, it may be the ultimate job skill. But good thinking can take you much farther than a good job. Here’s my thinking on the issue:
1) Thinking is foundational — the essence of running a business (or a government) is to make decisions about the future. To make effective decisions, we need to understand how we think, how our thinking can be biased, and how to evaluate evidence and arguments. If we know everything about finance, for instance, but don’t know how to think effectively, we will make decisions based on faulty evidence, weak arguments, and unconscious biases. By chance, we might still make some good decisions. But we should remember Louis Pasteur’s thought, “Chance favors the prepared mind.”
2) The future is unknowable — our niece, Amelia, will graduate from college next May. If she works until she’s 65, she’ll retire in the year 2060. What skills will employers need in 2060? Who knows? As I reflect on my own education, the content I learned in college is largely useless today. The processes I learned, however, are still very relevant. Thinking is the ultimate process. Amelia will still need to think effectively in 2060.
3) Thinking promotes freedom – if we can’t think for ourselves, we will forever be buffeted by other people’s agendas, desires, ambitions, and rhetorical excesses. Critical thinking allows us to assess ideas and social movements and make effective decisions on our own. We can frame our thinking as we wish and not allow others to create frames for us. We can identify the truth rather than relying on others to tell us what is true. Critical thinking allows us to take control of our own destiny, which is the essence of freedom. I don’t know of any other discipline that can make the same claim.