To answer the question, you’ll need to do a fair amount of research. You might dig through police reports, census data, city government publications, and so on. It’s a lot of work.
But our brains don’t like to work. As Daniel Kahneman writes, “Thinking is to humans as swimming is to cats. They can do it, but they prefer not to.”
So, instead of answering the original question, we substitute a simpler question: How much crime can I remember in my neighborhood?
If we can remember a lot of crime – if it’s top of mind — we’ll guess that our neighborhood has a high crime rate. If we can’t remember much crime, we’ll guess that we have a low crime rate. We use our memory as a proxy for reality. It’s simple and probably not wholly wrong. It’s good enough.
Let me ask you another simple question: How dangerous is coronavirus?
It’s a tough question. We can’t possibly know the “right” answer. Even the experts can’t figure it out. So, how does our mind work on a tough question like this?
First, we use our memory as a proxy for reality. How top of mind is coronavirus? How available is it to our memory? (This, as you might guess, is known as the availability bias.) Our media is saturated with stories about coronavirus. We see it every day. It’s easy to recall from memory. Must be a big deal.
Second, the media will continue to focus on coronavirus for several more months (at least). In the beginning, the media focused on the disease itself. Now, the media is more likely to focus on secondary effects – travel restrictions, quarantines, etc. Soon, the media will focus on reactions to the virus. Protesters will march on Washington demanding immediate action to protect us. The media will cover it.
This media activity is known as an availability cascade. The story keeps cascading into new stories and new angles on the same old story. The cascade keeps the story top of mind. It remains easily available to us. When was the last time we had a huge availability cascade? Think back to 2014 and the Ebola crisis. Sound familiar?
Third, our minds will consider how vivid the information is. How scary is it? How creepy? We remember vicious or horrific crimes much better than we remember mundane crimes like Saturday night stickups. How vivid is coronavirus? We see pictures every day of workers in hazmat suits. It’s vivid.
Fourth, what are other people doing? When we don’t know how to act in a given situation, we look for cues from our fellow humans. What do we see today? Pictures of empty streets and convention centers. We read that Chinatown in New York is empty of tourists. People are afraid. If they’re afraid, we probably should be, too.
Fifth, how novel is the situation? We’re much more afraid of devils we don’t know than of devils that we do know. The coronavirus – like the Ebola virus before it – is new and, therefore, unknowable. Health experts can reassure us but, deep in our heart of hearts, we know that nobody knows. We can easily imagine that it’s the worst-case scenario. It could be the end of life as we know it.
Sixth, is it controllable? We want to think that we can control the world around us. We study history because we think that knowing the past will help us control the future. If something scary is out of our control, we will spare no expense to bring it back under control. Even a small scare – like the Three Mile Island incident – can produce a huge reaction. At times, it seems that the cure may be worse than the disease.
So, what to do? First, apply some contextual thinking, both current and historical: you’re far more likely to succumb to plain old ordinary flu than to be infected by coronavirus. Get a flu shot. Then do what the old British posters from World War II told us all to do: Keep calm and carry on.
I recently saw an ad for Progressive Insurance that says, “Drivers who save with Progressive, save $796 on average.”
Now I like Progressive. And I love Flo. So, I’m sure that the statement is true. I’m sure it’s based on fact.
But it also entails a logical fallacy. If you don’t spot the fallacy, you may easily assume that the average savings for all drivers who switch to Progressive is $796. That would be a mistake.
This is a good example of the survivorship fallacy. We only examine cases that “survive” a certain threshold. In this case, the threshold is drivers who save. What about drivers who didn’t save?
Let’s say that we have 1,000 drivers who saved money. In fact, they saved a total of $796,000. On average, they saved $796 each.
Now let’s say that another 1,000 drivers saved nothing. Now we have 2,000 drivers who saved a total of $796,000. On average, they saved $398 each.
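The arithmetic above can be sketched in a few lines. (The 1,000/1,000 split is the hypothetical from this example, not Progressive’s actual data.)

```python
# Survivorship effect on an advertised average, using the example's numbers.
savers = [796] * 1000       # 1,000 drivers who saved $796 each
non_savers = [0] * 1000     # 1,000 drivers who saved nothing

total_saved = sum(savers) + sum(non_savers)  # $796,000 in total

# The advertised figure: average among drivers who "survived" the threshold.
avg_among_savers = sum(savers) / len(savers)

# The fuller picture: average across all drivers, savers and non-savers alike.
avg_among_all = total_saved / (len(savers) + len(non_savers))

print(avg_among_savers)  # 796.0
print(avg_among_all)     # 398.0
```

The only thing that changes between the two figures is the denominator: who gets counted.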
When we consider those people (or cases) that didn’t survive the threshold, the numbers change dramatically. You might hear an investment company say, “Investors who have stayed with us for ten years, made an average of 7.3% per year.” The threshold is stayed with us for ten years. Your question should be, “Well, what about those who didn’t stay for ten years?”
The survivorship fallacy doesn’t just affect numbers; it also affects qualities. Let’s say a prominent management journal publishes an article that proclaims, “The Ten Most Innovative Companies In The World Do These Three Things.” The threshold for selection is the ten most innovative companies (however that is measured). It’s quite possible that many other companies do the same three things but aren’t nearly as innovative. Since they didn’t survive the selection criterion, however, we don’t consider them.
What’s the moral? When you see an ad, put your critical thinking cap on. You’re going to need it.
Daniel Kahneman, the psychologist who won the Nobel Prize in economics, warns that our minds act as though “what you see is all there is.” It rarely is. I thought about Kahneman when I saw the videos and coverage of the teenagers wearing MAGA hats surrounding, and apparently mocking, a Native American activist who was singing a tribal song during a march in Washington, D.C.
The media coverage essentially came in two waves. The first wave concluded that the teenagers were mocking, harassing, and threatening the activist. Here are some headlines from the first wave:
ABC News: “Viral video of Catholic school teens in ‘MAGA’ caps taunting Native Americans draws widespread condemnation; prompts a school investigation.”
Time Magazine: “Kentucky Teens Wearing ‘MAGA’ Hats Taunt Indigenous Peoples March Participants In Viral Video.”
Evening Standard (UK): “Outrage as teens in MAGA hats ‘mock’ Native American Vietnam War veteran.”
The second media wave provided a more nuanced view. Here are some more recent headlines:
New York Times: “Fuller Picture Emerges of Viral Video of Native American Man and Catholic Students.”
The Guardian (UK): “New video sheds more light on students’ confrontation with Native American.”
The Stranger: “I Thought the MAGA Boys Were S**t-Eating Monsters. Then I Watched the Full Video.”
So, who is right and who is wrong? I’m not sure that we can draw any certain conclusions. I certainly have some opinions, but they are all based on very short video clips taken out of context.
What lessons can we draw from this? Here are a few:
In her book, Critical Thinking: An Appeal To Reason, Peg Tittle has an interesting and useful way of organizing 15 logical fallacies. Simply put, they’re all irrelevant to assessing whether an argument’s conclusion is true. Using Tittle’s guidelines, we can quickly sort out what we need to pay attention to and what we can safely ignore.
Though these fallacies are irrelevant to truth, they are very relevant to persuasion. Critical thinking is about discovering the truth; it’s about the present and the past. Persuasion is about the future, where truth has yet to be established. Critical thinking helps us decide what we can be certain of. Persuasion helps us make good choices when we’re uncertain. Critical thinking is about truth; persuasion is about choice. What’s poison to one is often catnip to the other.
With that thought in mind, let’s take a look at Tittle’s 15 irrelevant fallacies. If someone tosses one of these at you in a debate, your response is simple: “That’s irrelevant.”
Chances are that you’ve used some of these fallacies in a debate or argument. Indeed, you may have convinced someone to choose X rather than Y using them. Though these fallacies may be persuasive, it’s useful to remember that they have nothing to do with truth.
This fall, in addition to my regular academic courses, I’ll teach three one-day seminars designed for managers and executives.
These seminars draw on my academic courses and are repackaged for professionals who want to think more clearly and persuade more effectively. They also provide continuing education credits under the auspices of the University of Denver’s Center for Professional Development.
If you’re guiding your organization into an uncertain future, you’ll find them helpful. Here are the dates and titles along with links to the registration pages.
I hope to see you in one or more of these seminars. If you’re not in the Denver area, I can also take these on the road. Just let me know of your interest.