Seldom Right. Never In Doubt.

I’m never wrong. About anything.

Since I began teaching critical thinking four years ago, I’ve bought a lot of books on the subject. The other day, I wondered how many of those books I’ve actually read.

So, I made two piles on the floor. In one pile, I stacked all the books that I have read (some more than once). In the other pile, I stacked the books that I haven’t read.

Guess what? The unread stack is about twice as high as the other stack. In other words, I’ve read about a third of the books I’ve acquired on critical thinking and have yet to read about two-thirds.

What can I conclude from this? My first thought: I need to take a vacation and do a lot of reading. My second thought: Maybe I shouldn’t mention this to the Dean.

I also wondered, how much do I not know? Do I really know only a third of what there is to know about the topic? Maybe I know more since there’s bound to be some repetition in the books. Or maybe I know less since my modest collection may not cover the entire topic. Hmmm…

The point is that I’m thinking about what I don’t know rather than what I do know. That instills in me a certain amount of doubt. When I make assertions about critical thinking, I add cautionary words like perhaps or maybe or the evidence suggests. I leave myself room to change my position as new knowledge emerges (or as I acquire knowledge that’s new to me).

I suspect that the world might be better off if we all spent more time thinking about what we don’t know. And it’s not just me. The Dunning-Kruger effect states essentially the same thing.

David Dunning and Justin Kruger, who were both at Cornell when they ran their original studies, research cognitive biases. In those studies, they documented a bias that we now call illusory superiority. Simply put, we overestimate our own abilities and skills compared to others. More specifically, the less we know about a given topic, the more likely we are to overestimate our abilities. In other words, the less we know, the more confident we are in our opinions. As David Dunning succinctly puts it, “…incompetent people do not recognize—scratch that, cannot recognize—just how incompetent they are.”

The opposite seems to be true as well. Highly competent people tend to underestimate their competence relative to others. The thinking goes like this: If it’s easy for me, it must be easy for others as well. I’m not so special.

I’ve found that I can use the Dunning-Kruger effect as a rough-and-ready test of credibility. If a source provides a small amount of information with a high degree of confidence, then their credibility declines in my estimation. On the other hand, if the source provides a lot of information with some degree of doubt, then their credibility rises. It’s the difference between recognizing a wise person and a fool.

Perhaps we can use the same concept to greater effect in our teaching. When we learn about a topic, we implicitly learn about what we don’t know. Maybe we should make it more explicit. Maybe we should count the books we’ve read and the books we haven’t read to make it very clear just how much we don’t know. If we were less certain of our opinions, we would be more open to other people and intriguing new ideas. That can’t be a bad thing.

Have We Got It All Backwards?

Don’t shoot!

The concept of cause-and-effect is very slippery. We think that A causes B only to find that C really causes both A and B. Or, perhaps it’s really B that causes A. More subtly, A might influence B which turns right around and influences A.

Lately, I’ve been thinking that we’ve been looking at a lot of things from the wrong end of the telescope. Some examples:

Our brain creates us – what creates our personality and the essence of who we are? Why our brains, of course. My brain is the cause; my personality is the effect. Further, the brain is what it is; there’s not much we can do about it. Well…not so fast. Maybe we got it backwards. It turns out that the brain is plastic; we can change it through our habits, actions, and thoughts. In many ways, we create our brains rather than the other way round. Norman Doidge is a leading writer on brain plasticity. You can find his books here and here.

Mutate first; adapt later – our general model of evolution suggests that random mutations happen in our DNA. Mutations that provide a competitive edge are then preserved and passed on. Mutations that aren’t so helpful just fade away. But, according to a recent article in New Scientist, we may have it backwards. Again, plasticity is a key concept. “A growing number of biologists think … plasticity may also play a key role in evolution. Instead of mutating first and adapting later, they argue, animals often adapt first and mutate later.”

I am the master of my fate – I used to believe that I was in control. Now I realize that my System 1 often makes decisions without any input from “me”. Indeed, I don’t even know the decisions are being made. But it’s not just my “primitive brain” that molds my behavior. It’s also how fast my heart beats and how healthy my vagus nerve is. But it’s not even just my body that steers me. It’s also the microbes in my gut. When the microbes team up, they can make me do bizarre things – like eating chocolate. They may even contribute to schizophrenia.

OCD starts with thoughts – we’ve always assumed that irrational thoughts create obsessive-compulsive disorder. Irrational thoughts begin in the brain and radiate outward to produce irrational behavior. But, as Claire Gillan points out, we may have it backwards. When she induced new habits in volunteers, she found that people with OCD changed their beliefs to explain the new habit. In other words, behavior is the cause and belief is the effect.

The gardener manages the garden – Suellen loves to garden and will spend hours at hard labor under a hot sun. When I see how hard she works, I wonder if she’s managing the flowers or if they’re managing her. It’s not a new thought. The Botany of Desire makes the same point.

What else have we gotten backwards? It’s hard to know. But, as the Heath brothers point out in Decisive, if you believe A causes B, you owe it to yourself to consider the opposite.

Happiness and High Dudgeon

I’m always happy!

Some years ago, I discovered that I could improve my mood (and maybe my performance) simply by forcing myself to smile. I knew that being happy made me smile. I wondered if smiling could make me happy. Happily, it could.

As we’ve discussed before (here and here), the body and the brain are one system. The brain affects the body and the body returns the favor. Your posture affects your thoughts and your mood. According to Amy Cuddy, it can even help you get a job.

In fact, you don’t even have to smile. Just hold a pencil in your mouth sideways. Your smile muscles will flex and your mood will lift. Even though you know you’re tricking yourself, it actually works. Indeed, it’s foolproof.

When I discovered all this, I thought, “Great. I can always be happy. I’ll never be cranky or curmudgeonly again. I’ll always be in a good mood.”

But something funny happened on the way to happiness nirvana. I discovered that, on many occasions, I simply didn’t want to smile. I found that I actually enjoyed being cranky, snarky, and even a tad self-righteous.

This was a revelation. I’m a reasonably positive person and I always assumed that – if I had the choice – I would opt for happiness rather than its opposite. But I do have a choice and I find that I don’t always exercise it.

The dilemma seems to be the difference between being right and being happy. When I know I’m right – and somebody else is wrong – I can work myself into high dudgeon (as my mother called it). It’s a sense of self-righteous anger. I know I’m right and I want to prove the other side wrong. I’m indignant. I want to expose them for what they are – craven fools. My self-righteousness fuels the fire. I have no time to be happy. I’m on a mission.

I also enjoy winning, whether it’s an argument or a race or a poker hand. In a roundabout way, then, high dudgeon can lead to happiness. My indignation provides the energy and determination that can power me to victory. And victory makes me happy.

Getting angry to get happy, however, is a very odd dynamic. You’re going the wrong way. And winning one victory can simply lead to another battle. If there’s a victor, there’s also a vanquished. And he wants to get even.

I promised to write about happiness. First, I want to explore this conundrum of happiness and high dudgeon. Think about it. Do you actually want to be happy? Or, put another way, if you had to choose, would you rather be happy or right?

The Mother Of All Fallacies

An old script, it is.

How are Fox News and Michael Moore alike?

They both use the same script.

Michael Moore comes at issues from the left. Fox News comes from the right. Though they come from different points on the political spectrum, they tell the same story.

In rhetoric, it’s called the Good versus Evil narrative. It’s very simple. On one side we have good people. On the other side, we have evil people. There’s nothing in between. The evil people are cheating or robbing or killing or screwing the good people. The world would be a better place if we could only eliminate or neuter or negate or kill the evil people.

We’ve been using the Good versus Evil narrative since we began telling stories. Egyptian and Mayan hieroglyphics follow the script. So do many of the stories in the Bible. So do Republicans. So do Democrats. So do I, for that matter. It’s the mother of all fallacies.

The narrative inflames the passions and dulls the senses. It makes us angry. It makes us feel that outrage is righteous and proper. The narrative clouds our thinking. Indeed, it aims to stop us from thinking altogether. How can we think when evil is abroad? We need to act. We can think later.

I became sensitized to the Good versus Evil narrative when I lived in Latin America. I met more than a few people who are convinced that America is the embodiment of evil. They see it as a country filled with greedy, immoral thieves and murderers who are sucking the blood of the innocent and good people of Latin America. I had a difficult time squaring this with my own experiences. Perhaps the narrative is wrong.

Rhetoric teaches us to be suspicious when we hear Good versus Evil stories. The world is a messy, chaotic, and random place. Actions are nuanced and ambiguous. People do good things for bad reasons and bad things for good reasons. A simple narrative can’t possibly capture all the nuances and uncertainties of the real world. Indeed, the Good versus Evil narrative doesn’t even try. It aims to tell us what to think and ensure that we never, ever think for ourselves.

When Jimmy Carter was elected president, John Wayne attended his inaugural even though he had supported Carter’s opponent. Wayne gave a gracious speech. “Mr. President”, he said, “you know that I’m a member of the loyal opposition. But let’s remember that the accent is on ‘loyal’”. How I would love to hear anyone say that today. It’s the antithesis of Good versus Evil.

Voltaire wrote that, “Anyone who has the power to make you believe absurdities has the power to make you commit injustices.” The Good versus Evil narrative is absurd. It doesn’t explain the world; it inflames the world. Ultimately, it can make injustices seem acceptable.

The next time you hear a Good versus Evil story, grab your thinking cap. You’re going to need it.

(By the way, Tyler Cowen has a terrific TED talk on this topic that helped crystallize my thinking. You can find it here.)

Fallacy of Fallacies

It’s true!

Let’s talk about logic for a moment. When you hear the word argument, you may think of a heated exchange of opinions. It’s emotional and angry. A logician would call this a quarrel rather than an argument. In the world of logic, an argument means that you offer reasons to support a conclusion.

An argument can be valid or invalid and sound or unsound. Here’s an example of an argument in a classic form:

Premise 1: All women have freckles.
Premise 2: Suellen is a woman.
Conclusion: Suellen has freckles.

We have two reasons that lead us to a conclusion. In other words, it’s an argument. Is it a good argument? Well, that’s a different question.

Let’s look first at validity. An argument is valid if the conclusion flows logically from the premises. In this case, we have a major premise and a minor premise and – if they are true – the conclusion is inescapable. Suellen must have freckles. The conclusion flows logically from the premises. The argument is valid.

But is the argument sound? An argument is sound if it’s valid and its premises are actually true. The second premise checks out – Suellen is indeed a woman. But the first premise doesn’t. All we have to do is look around. We’ll quickly realize that the first premise is false – not all women have freckles.

So, the argument is valid but unsound. One of the premises that leads to the conclusion is false. Can we safely assume, then, that the conclusion is also false? Not so fast, bub.

This is what’s known as the fallacy of fallacies. We often assume that, if there’s a fallacy in an argument, then the conclusion must necessarily be false. Not so. It means the conclusion is not proven. The fact that something is not proven doesn’t necessarily mean that it’s false. (Indeed, in technical terms, we’ve never proven that smoking causes cancer in humans).

Our example demonstrates the fallacy of fallacies. We agree that the argument is valid but not sound. One of the premises is false. Yet, if you know Suellen, you know that the conclusion is true. She does indeed have freckles. So even an unsound (or invalid) argument can result in a conclusion that’s true.
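If you like to see the distinction in concrete terms, here’s a minimal Python sketch. The tiny “world” of people below is entirely invented for illustration; it simply shows that the freckles argument can come out unsound (premise 1 fails) while its conclusion is still true:

# A toy "world" of people. Names and attributes are made up for illustration only.
world = [
    {"name": "Suellen", "is_woman": True,  "has_freckles": True},
    {"name": "Maria",   "is_woman": True,  "has_freckles": False},  # this person breaks premise 1
    {"name": "Tom",     "is_woman": False, "has_freckles": True},
]

# Premise 1: All women have freckles.
premise_1 = all(p["has_freckles"] for p in world if p["is_woman"])

# Premise 2: Suellen is a woman.
suellen = next(p for p in world if p["name"] == "Suellen")
premise_2 = suellen["is_woman"]

# Conclusion: Suellen has freckles.
conclusion = suellen["has_freckles"]

# The argument's form is valid: if both premises were true, the conclusion
# couldn't be false. Soundness asks whether the premises really are true here.
sound = premise_1 and premise_2

print("Premise 1 true?", premise_1)    # False: not all women have freckles
print("Sound argument?", sound)        # False: one premise is false
print("Conclusion true?", conclusion)  # True: Suellen has freckles anyway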

What’s the moral here? There’s a big difference between not proven and not true. Something that’s not proven may well be true. That’s when you want to consider Pascal’s Wager.
