Strategy. Innovation. Brand.


Critical Thinking — Ripped From The Headlines!

Just because we see a path doesn’t mean we should follow it.

 

Daniel Kahneman, the psychologist who won the Nobel Prize in economics, reminds us that "what you see is not all there is." I thought about Kahneman when I saw the videos and coverage of the teenagers wearing MAGA hats surrounding, and apparently mocking, a Native American activist who was singing a tribal song during a march in Washington, D.C.

The media coverage essentially came in two waves. The first wave concluded that the teenagers were mocking, harassing, and threatening the activist. Here are some headlines from the first wave:

 

ABC News: “Viral video of Catholic school teens in ‘MAGA’ caps taunting Native Americans draws widespread condemnation; prompts a school investigation.”

Time Magazine: “Kentucky Teens Wearing ‘MAGA’ Hats Taunt Indigenous Peoples March Participants In Viral Video.”

Evening Standard (UK): “Outrage as teens in MAGA hats ‘mock’ Native American Vietnam War veteran.”

The second media wave provided a more nuanced view. Here are some more recent headlines:

New York Times: “Fuller Picture Emerges of Viral Video of Native American Man and Catholic Students.”

The Guardian (UK): “New video sheds more light on students’ confrontation with Native American.”

The Stranger: “I Thought the MAGA Boys Were S**t-Eating Monsters. Then I Watched the Full Video.”

So, who is right and who is wrong? I’m not sure that we can draw any certain conclusions. I certainly have opinions, but they are all based on very short video clips taken out of context.

What lessons can we draw from this? Here are a few:

  • Reality is complicated and — even in the best circumstances — we only see a piece of it.
  • We see what we expect to see. Tell me how you voted, and I can guess what you saw.
  • It’s very hard to draw firm conclusions from brief slice-of-time sources such as a photograph or a video clip. The Atlantic has an in-depth story about how this story evolved. One key sentence: “Photos by definition capture instants of time, and remove them from the surrounding flow.”
  • There’s an old saying that “journalism is the first draft of history.” Photojournalism is probably the first draft of the first draft. It’s often useful to wait and see how the story evolves. Slowing down a decision process usually results in a better decision.
  • It’s hard to read motives from a picture.
  • Remember that what we see is not all there is. As the Heath brothers write in their book, Decisive, move your spotlight around to avoid narrow framing.
  • Humans don’t like to be uncertain. We like to connect the dots and explain things even when we don’t have all the facts. But, sometimes, uncertainty is the best we can hope for. When you’re uncertain, remember the lessons of Appreciative Inquiry and don’t default to the negative.

 

That’s Irrelevant!

In her book, Critical Thinking: An Appeal To Reason, Peg Tittle has an interesting and useful way of organizing 15 logical fallacies. Simply put, they’re all irrelevant to the assessment of whether an argument is true or not. Using Tittle’s guidelines, we can quickly sort out what we need to pay attention to and what we can safely ignore.

Though these fallacies are irrelevant to truth, they are very relevant to persuasion. Critical thinking is about discovering the truth; it’s about the present and the past. Persuasion is about the future, where truth has yet to be established. Critical thinking helps us decide what we can be certain of. Persuasion helps us make good choices when we’re uncertain. Critical thinking is about truth; persuasion is about choice. What’s poison to one is often catnip to the other.

With that thought in mind, let’s take a look at Tittle’s 15 irrelevant fallacies. If someone tosses one of these at you in a debate, your response is simple: “That’s irrelevant.”

  1. Ad hominem: character – the person who created the argument is evil; therefore the argument is unacceptable. (Or the reverse.)
  2. Ad hominem: tu quoque – the person making the argument doesn’t practice what she preaches. You say it’s wrong to hunt, but you eat meat.
  3. Ad hominem: poisoning the well – the person making the argument has something to gain. He’s not disinterested. Before you listen to my opponent, I would like to remind you that he stands to profit handsomely if you approve this proposal.
  4. Genetic fallacy – considering the origin, not the argument. The idea came to you in a dream. That’s not acceptable.
  5. Inappropriate standard: authority – an authority may claim to be an expert, but experts are biased in predictable ways.
  6. Inappropriate standard: tradition – we accept something because we have traditionally accepted it. But traditions are often fabricated.
  7. Inappropriate standard: common practice – just because everybody’s doing it doesn’t make it true. Appeals to inertia and the status quo. Example: most people peel bananas from the “wrong” end.
  8. Inappropriate standard: moderation/extreme – claims that an argument is true (or false) because it is too extreme (or too moderate). AKA the Fallacy of the Golden Mean.
  9. Appeal to popularity: bandwagon effect – just because an idea is popular doesn’t make it true. Often found in ads.
  10. Appeal to popularity: elites – only a few can make the grade or be admitted. Many are called; few are chosen.
  11. Two wrongs – our competitors have cheated; therefore, it’s acceptable for us to cheat.
  12. Going off topic: straw man or paper tiger – misrepresenting the other side. Responding to an argument that is not the argument presented.
  13. Going off topic: red herring – a false scent that leads the argument astray. Person 1: We shouldn’t do that. Person 2: If we don’t do it, someone else will.
  14. Going off topic: non sequitur – making a statement that doesn’t follow logically from its antecedents. Person 1: We should feed the poor. Person 2: You’re a communist.
  15. Appeal to emotion – using emotions rather than logic to assess truth. You say we should kill millions of chickens to stop the avian flu. That’s disgusting.

Chances are that you’ve used some of these fallacies in a debate or argument. Indeed, you may have convinced someone to choose X rather than Y using them. Though these fallacies may be persuasive, it’s useful to remember that they have nothing to do with truth.

One-Day Seminars – Fall 2018

Wake up! It’s seminar time.

This fall, in addition to my regular academic courses, I’ll teach three one-day seminars designed for managers and executives.

These seminars draw on my academic courses and are repackaged for professionals who want to think more clearly and persuade more effectively. They also provide continuing education credits under the auspices of the University of Denver’s Center for Professional Development.

If you’re guiding your organization into an uncertain future, you’ll find them helpful. Here are the dates and titles along with links to the registration pages.

I hope to see you in one or more of these seminars. If you’re not in the Denver area, I can also take these on the road. Just let me know of your interest.

Why Study Critical Thinking?

Friend or foe?

People often ask me why they should take a class in critical thinking. Their typical refrain is, “I already know how to think.” I find that the best answer is a story about the mistakes we often make.

So I offer up the following example, drawn from recent news, about very smart people who missed a critical clue because they were not thinking critically.

The story is about the conventional wisdom surrounding Alzheimer’s. We’ve known for years that people who have Alzheimer’s also have higher than normal deposits of beta amyloid plaques in their brains. These plaques build up over time and interfere with memory and cognitive processes.

The conventional wisdom holds that beta amyloid plaques are an aberration. The brain has essentially gone haywire and starts to attack itself. It’s a mistake. A key research question has been: how do we prevent this mistake from happening? It’s a difficult question to answer because we have no idea what triggered the mistake.

But recent research, led by Rudolph Tanzi and Robert Moir, considers the opposite question. What if the buildup of beta amyloid plaques is not a mistake? What if it serves some useful purpose? (Click here and here for background articles).

Pursuing this line of reasoning, Tanzi and Moir discovered that beta amyloid is actually an antimicrobial substance. It has a beneficial purpose: to attack bacteria and viruses and smother them. It’s not a mistake; it’s a defense mechanism.

Other Alzheimer’s researchers have described themselves as “gobsmacked” and “surprised” by the discovery. One said, “I never thought about it as a possibility.”

A student of critical thinking might ask, Why didn’t they think about this sooner? A key tenet of critical thinking is that one should always ask the opposite question. If conventional wisdom holds that X is true, a critical thinker would automatically ask, Is it possible that the opposite of X is true in some way?

Asking the opposite question is a simple way to identify, clarify, and check our assumptions. When the conventional wisdom is correct, it leads to a dead end. But, occasionally, asking the opposite question can lead to a Nobel Prize. Consider the case of Barry Marshall.

A doctor in Perth, Australia, Marshall was concerned about his patients’ stomach ulcers. Conventional wisdom held that bacteria couldn’t possibly live in the gastric juices of the human gut, so bacteria couldn’t possibly cause ulcers. More likely, stress and anxiety were the culprits. But Marshall asked the opposite question and discovered the bacterium now known as H. pylori. Stress doesn’t cause ulcers; bacteria do. For asking the opposite question — and answering it — Marshall won the Nobel Prize in Medicine in 2005.

The discipline of critical thinking gives us a structure and method – almost a checklist – for how to think through complex problems. We should always ask the opposite question. We should be aware of common fallacies and cognitive biases. We should understand the basics of logic and argumentation. We should ask simple, blunt questions. We should check our egos at the door. If we do all this – and more – we tilt the odds in our favor. We prepare our minds systematically and open them to new possibilities – perhaps even the possibility of curing Alzheimer’s. That’s a good reason to study critical thinking.

What We Don’t Know and Don’t See

We have differing assumptions.

It’s hard to think critically when you don’t know what you’re missing. As we think about improving our thinking, we need to account for two things that are so subtle that we don’t fully recognize them:

  • Assumptions – we make so many assumptions about the world around us that we can’t possibly keep track of them all. We make assumptions about the way the world works, about who we are and how we fit, and about the right way to do things (the way we’ve always done them). A key tenet of critical thinking is that we should question our own thinking. But it’s hard to question our assumptions if we don’t even realize that we’re making assumptions.
  • Sensory filters – our eyes are bombarded with millions of images each second, but our brains can only process roughly a dozen images per second. We filter out everything else. We filter out enormous amounts of visual data, and we also filter information that comes through our other senses – sounds, smells, etc. In other words, we’re not getting a complete picture. How can we think critically when we don’t know what we’re not seeing? Additionally, my picture of reality differs from your picture of reality (or anyone else’s, for that matter). How can we communicate effectively when we don’t have the same pictures in our heads?

Because of assumptions and filters, we often talk past each other. The world is a confusing place and becomes even more confusing when our perception of what’s “out there” is unique. How can we overcome these effects? We need to consider two sets of questions:

  • How can we identify the assumptions we’re making? – I find that the best method is to compare notes with other people, especially people who differ from me in some way. Perhaps they work in a different industry, come from a different country, or belong to a different political party. As we discuss what we perceive, we can start to see our own assumptions. Additionally, we can think about our assumptions that have changed over time. Why did we once assume X but now assume Y? How did we arrive at X in the first place? What’s changed to move us toward Y? Did external reality change, or did we change?
  • How can we identify what we’re not seeing (or hearing, etc.)? – This is a hard problem to solve. We’ve learned to filter information over our entire lifetimes. We don’t know what we don’t see. Here are two suggestions:
    • Make the effort to see something new – let’s say that you drive the same route to work every day. Tomorrow, when you drive the route, make an effort to see something that you’ve never seen before. What is it? Why do you think you missed it before? Does the thing you missed belong to a category? Are you missing the entire category? Here’s an example: I tend not to see houseplants. My wife tends not to see classic cars. Go figure.
    • View a scene with a friend or loved one. Write down what you see. Ask the other person to do the same. What are the differences? Why do they exist?

The more we study assumptions and filters, the more attuned we become to their prevalence. When we make a decision, we’ll remember to inquire about ourselves before we inquire about the world around us. That will lead us to better decisions.
