The 1989 Tour de France was decided on the last stage, a 15.2-mile time trial into Paris. The leader, Laurent Fignon, held a fifty-second advantage over Greg LeMond. Both riders were strong time trialists. Making up fifty seconds in such a short race seemed impossible, and most observers assumed that Fignon would hold his lead and win the overall title.
In most time trials, coaches radio the riders to inform them of their speed, splits, and competitive position. In this final time trial, however, LeMond turned off his radio. He didn’t want to know. He feared that, if he knew too much, he might ease up. Instead, he raced flat out for the entire distance, averaging 33.9 miles per hour, a record at the time. In a stunning finish, LeMond gained 58 seconds on Fignon and won the race by a scant eight seconds. (Here’s a terrific video recap of the final stage).
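The stage numbers hang together arithmetically. Here's a quick back-of-the-envelope check, using the distance, speed, and time gaps reported above (all approximate):

```python
# Back-of-the-envelope check of the stage arithmetic, using the
# figures reported above (all approximate).
distance_miles = 15.2
avg_speed_mph = 33.9

# LeMond's elapsed time for the stage, in minutes.
lemond_minutes = distance_miles / avg_speed_mph * 60

# Fignon began the stage 50 seconds ahead; LeMond took 58 seconds
# out of him, leaving an 8-second final margin.
fignon_lead_s = 50
lemond_gain_s = 58
final_margin_s = lemond_gain_s - fignon_lead_s

print(round(lemond_minutes, 1), final_margin_s)  # about 26.9 minutes; margin of 8
```

At that speed the stage took under 27 minutes, which makes gaining nearly a full minute on a rider of Fignon's caliber all the more remarkable.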
LeMond’s strategy is today known as information avoidance. He chose not to accept information that he knew was freely available to him. LeMond knew that he might be distracted by the information. He chose instead to focus solely on his own performance – the only variable that he could control.
While information avoidance worked for LeMond, the strategy often yields suboptimal outcomes. We choose not to know something, and the not knowing creates health hazards, financial obstacles, and a series of unfortunate events. Think of the patient who won't open a test result, or the investor who won't check a falling portfolio.
In some ways, information avoidance is the flip side of the confirmation bias. We accept information that confirms our beliefs and avoid information that doesn't. But there seems to be more to avoidance than simply the desire to dodge disconfirming information.
Information avoidance can also teach us about persuasion. If we want to persuade people to change their opinion about something, making it scarier is probably self-defeating: people will be more likely to avoid the information than to seek it out. Similarly, bombarding people with more and more information is likely to be counter-productive. People under bombardment become defensive rather than open-minded.
As Aristotle noted, persuasion consists of three facets: 1) ethos (credibility); 2) pathos (emotional connection); 3) logos (logic and information). Today, we often seek to persuade with logos – information and logic. But Aristotle taught that logos is the least persuasive facet. We typically use logos to justify a decision rather than to make a decision. Ethos and pathos are much more influential in making the decision. The recent research on information avoidance suggests that we’ll persuade more people with ethos and pathos than we ever will with logos. Aristotle was right.
Greg LeMond’s example shows that information avoidance can provide important benefits. But, as we develop our communication strategies, let’s keep the downsides in mind. We need to package our arguments in ways that will reduce information avoidance and lead to a healthier exchange of ideas.
My friend, Andy, once taught in the Semester at Sea program. The program has an ocean-going ship and brings undergraduates together for a semester of sea travel, classes, and port calls. Andy told me that he was fascinated watching these youngsters come together and form in-groups and out-groups. The cliques were fairly stable while the ship was at sea but more fluid when it was in port.
Andy told me, for instance, that some of the women described some of the men as “Ship cute, shore ugly.” The very concept of “cute” was flexible and depended entirely on context. When at sea, a limited supply of men caused the “cute” definition to expand. In port, with more men available, the definition of cute became more stringent.
We usually think of concepts as more-or-less fixed, unlike processes that expand over time. The military, for instance, is familiar with “mission creep” – a mission may start with small, well-defined objectives that grow over time. Similarly, software developers understand “feature creep” – new features are added as the software is developed. But do concepts creep? The Semester at Sea example suggests that they do, depending on prevalence.
This was also the finding of a research paper published in a recent issue of Science magazine. (Click here). Led by David Levari, the researchers showed that “… people often respond to decreases in the prevalence of a stimulus by expanding their concept of it.” In the Semester at Sea example, as the stimulus (men) decreases, the concept of cute expands. According to Levari et al., this is a common phenomenon and not just related to hormonal youngsters isolated on a ship.
The researchers started with a very neutral stimulus – the color of dots. They presented 1,000 dots ranging in color from purple to blue and asked participants to identify the blue ones. They repeated the trial several hundred times. Participants were remarkably consistent from trial to trial: dots identified as blue in the first trials were still identified as blue in the last.
The researchers then repeated the trials while reducing the number of blue dots. Would participants in the second set of trials – with decreased stimulus — expand their definition of “blue” and identify dots as blue that they had originally identified as purple? Indeed, they would. In fact, the number of purple-to-blue “crossover” dots was remarkably consistent through numerous trials.
The researchers also varied the instructions for the comparisons. In the first study, participants were told that the number of blue dots “might change” in the second pass. In a second study, participants were told that the number of blue dots would “definitely decrease.” In a third study, participants were instructed to “be consistent” and were offered monetary rewards for doing so. In some studies the number of blue dots declined gradually. In others, the blue dots decreased abruptly. These procedural changes had virtually no impact on the results. In all cases, declining numbers of blue dots resulted in an expanded definition of “blue”.
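One way to see how falling prevalence can move a concept's boundary is a toy simulation. The sketch below uses a relative-judgment rule – a dot is called "blue" whenever it is bluer than the average dot in the block – which is my illustrative assumption, not the mechanism Levari et al. propose; the hues, block size, and cutoff are all made up.

```python
import random

random.seed(0)  # reproducible toy run

def run_block(blue_fraction, n=500):
    """Show n dots; blue_fraction of them are objectively blue.
    The judge is relative: a dot is called "blue" if its hue exceeds
    the average hue in the block. Returns the share called blue."""
    # Hues in [0, 1]: below 0.5 reads purple, above 0.5 reads blue.
    hues = [random.uniform(0.5, 1.0) if random.random() < blue_fraction
            else random.uniform(0.0, 0.5)
            for _ in range(n)]
    avg = sum(hues) / n  # the judge's moving standard
    return sum(h > avg for h in hues) / n

# With half the dots objectively blue, roughly half get called blue.
# With only 10% objectively blue, the relative judge still calls far
# more than 10% of the dots "blue" -- the concept has expanded.
print(round(run_block(0.5), 2), round(run_block(0.1), 2))
```

When blue dots are scarce, the block's average hue drifts purple-ward, so dots that would once have read as purple now clear the bar – the same expansion the participants showed.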
Does concept creep extend beyond dots? The researchers did similar trials with 800 images of human faces that had been rated on a continuum from “very threatening” to “not very threatening.” The results were essentially the same as the dot studies. When the researchers reduced the number of threatening faces, participants expanded their definition of “threatening.”
All these tests used visual stimuli. Does concept creep also apply to nonvisual stimuli? To test this, the researchers asked participants to evaluate whether 240 research proposals were ethical or not. The results were essentially the same. When the participants saw many unethical proposals, their definition of ethics was fairly stringent. When they saw fewer unethical proposals, their definition expanded.
It seems then that “prevalence-induced concept change” – as the researchers label it – is probably common in human behavior. Could this help explain some of the pessimism in today’s world? For example, numerous sources verify that the crime rate in the United States has declined over the past two decades. (See here, here, and here, for example). Yet many people believe that the crime rate has soared. Could concept creep be part of the problem? It certainly seems likely.
Yet again, our perception of reality differs from actual reality. Like other cognitive biases, concept creep distorts our perception in predictable ways. As the number of stimuli – from cute to blue to ethical – goes down, we expand our definition of what the concept actually means. As “bad news” decreases, we expand our definition of what is “bad”. No wonder we’re pessimistic.
Daniel Kahneman is rightly respected for discovering and documenting any number of irrational human behaviors. Prospect theory – developed by Kahneman and his colleague, Amos Tversky – has led to profound new insights into how we think, behave, and spend our money. Indeed, there’s a straight line from Kahneman and Tversky to the new discipline called Behavioral Economics.
In my humble opinion, however, one of Kahneman’s innovations has been overlooked. The innovation doesn’t have an agreed-upon name so I’m proposing that we call it the Kahneman Interview Technique or KIT.
The idea behind KIT is fairly simple. We all know about the confirmation bias – the tendency to attend to information that confirms what we already believe and to ignore information that doesn’t. Kahneman’s insight is that confirmation bias distorts job interviews.
Here’s how it works. When we meet a candidate for a job, we immediately form an impression. The distortion occurs because this first impression colors the rest of the interview. Our intuition might tell us, for instance, that the candidate is action-oriented. For the rest of the interview, we attend to clues that confirm this intuition and ignore those that don’t. Ultimately, we base our evaluation on our initial impressions and intuition, which may be sketchy at best. The result – as Google found – is that there is no relationship between an interviewer’s evaluation and a candidate’s actual performance.
To remove the distortion of our confirmation bias, KIT asks us to delay our intuition. How can we delay intuition? By focusing first on facts and figures. For any job, there are prerequisites for success that we can measure by asking factual questions. For instance, a salesperson might need to be: 1) well spoken; 2) observant; 3) technically proficient, and so on. An executive might need to be: 1) a critical thinker; 2) a good strategist; 3) a good talent finder, etc.
Before the interview, we prepare factual questions that probe these prerequisites. We begin the interview with facts and develop a score for each prerequisite – typically on a simple scale like 1 – 5. The idea is not to record what the interviewer thinks but rather to record what the candidate has actually done. This portion of the interview is based on facts, not perceptions.
Once we have a score for each dimension, we can take the interview in more qualitative directions. We can ask broader questions about the candidate’s worldview and philosophy. We can invite our intuition to enter the process. At the end of the process, Kahneman suggests that the interviewer close her eyes, reflect for a moment, and answer the question, How well would this candidate do in this particular job?
Kahneman and other researchers have found that the factual scores are much better predictors of success than traditional interviews. Interestingly, the concluding global evaluation is also a strong predictor, especially when compared with “first impression” predictions. In other words, delayed intuition is better at predicting job success than immediate intuition. It’s a good idea to keep in mind the next time you hire someone.
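The procedure described above lends itself to a simple checklist. Here's a sketch of delayed-intuition scoring in code; the prerequisites, the 1–5 scale, and the 70/30 weighting are illustrative assumptions of mine, not Kahneman's prescription.

```python
# A sketch of delayed-intuition scoring. The prerequisites and the
# 70/30 weighting are illustrative assumptions, not Kahneman's own
# prescription.
from statistics import mean

PREREQUISITES = ["well spoken", "observant", "technically proficient"]

def score_candidate(fact_ratings, delayed_intuition):
    """fact_ratings: prerequisite -> 1-5 score from factual questions,
    recorded before any overall judgment is formed.
    delayed_intuition: the closing eyes-closed 1-5 judgment, made only
    after the factual scores are locked in."""
    for p in PREREQUISITES:
        if not 1 <= fact_ratings[p] <= 5:
            raise ValueError(f"rating for {p!r} must be between 1 and 5")
    factual = mean(fact_ratings[p] for p in PREREQUISITES)
    # Lean on the factual average; let delayed intuition adjust it.
    return 0.7 * factual + 0.3 * delayed_intuition

print(score_candidate(
    {"well spoken": 4, "observant": 3, "technically proficient": 5},
    delayed_intuition=4))
```

The point of the structure isn't the arithmetic; it's the ordering. The factual scores are locked in before intuition is allowed into the room.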
I first learned about the Kahneman Interview Technique several years ago when I read Kahneman’s book, Thinking, Fast and Slow. But the book is filled with so many good ideas that I forgot about the interviews. I was reminded of them recently when I listened to the 100th episode of the podcast, Hidden Brain, which features an interview with Kahneman. This article draws on both sources.
People often ask me why they should take a class in critical thinking. Their typical refrain is, “I already know how to think.” I find that the best answer is a story about the mistakes we often make.
So I offer up the following example, drawn from recent news, about very smart people who missed a critical clue because they were not thinking critically.
The story is about the conventional wisdom surrounding Alzheimer’s. We’ve known for years that people who have Alzheimer’s also have higher-than-normal deposits of beta amyloid plaques in their brains. These plaques build up over time and interfere with memory and cognitive processes.
The conventional wisdom holds that beta amyloid plaques are an aberration. The brain has essentially gone haywire and starts to attack itself. It’s a mistake. A key research question has been: how do we prevent this mistake from happening? It’s a difficult question to answer because we have no idea what triggered the mistake.
But recent research, led by Rudolph Tanzi and Robert Moir, considers the opposite question. What if the buildup of beta amyloid plaques is not a mistake? What if it serves some useful purpose? (Click here and here for background articles).
Pursuing this line of reasoning, Tanzi and Moir discovered that beta amyloid is actually an antimicrobial substance. It has a beneficial purpose: to attack bacteria and viruses and smother them. It’s not a mistake; it’s a defense mechanism.
Other Alzheimer’s researchers have described themselves as “gobsmacked” and “surprised” by the discovery. One said, “I never thought about it as a possibility.”
A student of critical thinking might ask, Why didn’t they think about this sooner? A key tenet of critical thinking is that one should always ask the opposite question. If conventional wisdom holds that X is true, a critical thinker would automatically ask, Is it possible that the opposite of X is true in some way?
Asking the opposite question is a simple way to identify, clarify, and check our assumptions. When the conventional wisdom is correct, it leads to a dead end. But, occasionally, asking the opposite question can lead to a Nobel Prize. Consider the case of Barry Marshall.
A doctor in Perth, Australia, Marshall was concerned about his patients’ stomach ulcers. Conventional wisdom held that bacteria couldn’t possibly live in the gastric juices of the human gut, so bacteria couldn’t possibly cause ulcers. More likely, stress and anxiety were the culprits. But Marshall asked the opposite question and discovered the bacteria now known as H. pylori. Stress doesn’t cause ulcers; bacteria do. For asking the opposite question — and answering it — Marshall won the Nobel Prize in Medicine in 2005.
The discipline of critical thinking gives us a structure and method – almost a checklist – for how to think through complex problems. We should always ask the opposite question. We should be aware of common fallacies and cognitive biases. We should understand the basics of logic and argumentation. We should ask simple, blunt questions. We should check our egos at the door. If we do all this – and more – we tilt the odds in our favor. We prepare our minds systematically and open them to new possibilities – perhaps even the possibility of curing Alzheimer’s. That’s a good reason to study critical thinking.
A few days ago, the RAND Corporation — one of America’s oldest think tanks — published a report titled, Truth Decay: A Threat To Policymaking and Democracy. Though I’ve read it only once and probably don’t yet grasp all its nuances, I think it’s very relevant to our world today and want to bring it to your attention.
You can find the full report here. Here are some of the highlights. The items in italics are direct quotes. The items not in italics are my comments and opinions.
What Is Truth Decay?
Heightened disagreement about facts and analytical interpretations of data — we used to disagree about opinions. Now we increasingly disagree about facts.
The blurred line between opinion and fact — we used to separate “news” from “editorial”. It was easy to tell which was which. Today, the two are commonly mixed together.
Increased volume and influence of opinion and personal experience across the communication landscape — our channels of mass communication used to be dominated by factual reporting with some clearly labeled opinion pieces mixed in. Today the reverse seems to be true.
Diminished trust in formerly respected institutions as sources of factual information — we used to have Walter Cronkite. Now we don’t.
Why Has The Truth Decayed?
Characteristics of human information processing, such as cognitive biases — these are the same biases that we’ve been studying this quarter.
Changes in the information system, such as the rise of 24-hour news coverage, social media, and dissemination of disinformation and misleading or biased information — we used to sip information slowly. Now we’re swamped by it.
Competing demands on the educational system that challenge its ability to keep pace with information system changes — is our educational system teaching people what to think or how to think?
Polarization in politics, society, and the economy — we’ve sorted ourselves out so that we only have to meet and interact with people — and ideas — that we agree with.
It’s a bracing read and I recommend it to you.