
Seldom Right. Never In Doubt.

I’m never wrong. About anything.

Since I began teaching critical thinking four years ago, I’ve bought a lot of books on the subject. The other day, I wondered how many of those books I’ve actually read.

So, I made two piles on the floor. In one pile, I stacked all the books that I have read (some more than once). In the other pile, I stacked the books that I haven’t read.

Guess what? The unread stack is about twice as high as the other stack. In other words, I’ve read about a third of the books I’ve acquired on critical thinking and have yet to read about two-thirds.

What can I conclude from this? My first thought: I need to take a vacation and do a lot of reading. My second thought: Maybe I shouldn’t mention this to the Dean.

I also wondered: how much do I not know? Do I really know only a third of what there is to know about the topic? Maybe I know more, since there’s bound to be some repetition in the books. Or maybe I know less, since my modest collection may not cover the entire topic. Hmmm…

The point is that I’m thinking about what I don’t know rather than what I do know. That instills in me a certain amount of doubt. When I make assertions about critical thinking, I add cautionary words like perhaps or maybe or the evidence suggests. I leave myself room to change my position as new knowledge emerges (or as I acquire knowledge that’s new to me).

I suspect that the world might be better off if we all spent more time thinking about what we don’t know. And it’s not just me: the research behind the Dunning-Kruger effect makes essentially the same point.

David Dunning and Justin Kruger, both at Cornell, study cognitive biases. In their studies, they documented a bias that we now call illusory superiority. Simply put, we overestimate our own abilities and skills compared to others. More specifically, the less we know about a given topic, the more likely we are to overestimate our abilities. In other words, the less we know, the more confident we are in our opinions. As David Dunning succinctly puts it, “…incompetent people do not recognize—scratch that, cannot recognize—just how incompetent they are.”

The opposite seems to be true as well. Highly competent people tend to underestimate their competence relative to others. The thinking goes like this: If it’s easy for me, it must be easy for others as well. I’m not so special.

I’ve found that I can use the Dunning-Kruger effect as a rough-and-ready test of credibility. If a source provides a small amount of information with a high degree of confidence, then their credibility declines in my estimation. On the other hand, if the source provides a lot of information with some degree of doubt, then their credibility rises. It’s the difference between a wise person and a fool.

Perhaps we can use the same concept to greater effect in our teaching. When we learn about a topic, we implicitly learn about what we don’t know. Maybe we should make it more explicit. Maybe we should count the books we’ve read and the books we haven’t read to make it very clear just how much we don’t know. If we were less certain of our opinions, we would be more open to other people and intriguing new ideas. That can’t be a bad thing.

Two Brains. So What?

Alas, poor System 1…

We have two different thinking systems in our brain, often called System 1 and System 2. System 1 is fast and automatic and makes up to 95% of our decisions. System 2 is a slow energy hog that allows us to think through issues consciously. When we think of thinking, we’re thinking of System 2.

You might ask: Why would this matter to anyone other than neuroscientists? It’s interesting to know, but does it have any practical impact? Well, here are some things that we might want to change based on the dual-brain idea.

Economic theory – our classic economic theories depend on the notion of rational people making rational decisions. As Daniel Kahneman points out, that’s not the way the world works. For instance, our loss aversion bias pushes us towards non-rational investment decisions. It happens all the time and has created a whole new school of thought called behavioral economics (and a Nobel Prize for Kahneman).
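(As an aside that isn’t part of the original post: loss aversion is often formalized with Kahneman and Tversky’s prospect-theory value function, sketched here with their 1992 parameter estimates.)

\[
v(x) =
\begin{cases}
x^{\alpha} & \text{if } x \ge 0 \\
-\lambda\,(-x)^{\alpha} & \text{if } x < 0
\end{cases}
\qquad \alpha \approx 0.88,\quad \lambda \approx 2.25
\]

With λ ≈ 2.25, a $100 loss feels roughly as bad as a $225 gain feels good. That’s why many people turn down a coin flip that pays $150 on heads and costs $100 on tails, even though its expected value is +$25.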

Intelligence testing – System 1 makes up to 95% of our decisions, but our classic IQ tests focus exclusively on System 2. That doesn’t make sense. We need new tests that incorporate rationality as well as intelligence.

Advertising – we often measure the effectiveness of advertising through awareness tests. Yet System 1 operates below the threshold of awareness. We can know things without knowing that we know them. As Peter Steidl points out, if we make 95% of our decisions in System 1, doesn’t it also follow that we make (roughly) 95% of our purchase decisions in System 1? Branding should focus on our habits and memory rather than our awareness.

Habits (both good and bad) – we know that we shouldn’t procrastinate (or smoke or eat too much, etc.). We know that in System 2, our conscious self. But System 2 doesn’t control our habits; System 1 does. In fact, John Arden calls System 1 the habitual brain. If we want to change our bad habits (or reinforce our good ones), we need to change the habits and rules stored in System 1. How do we do that? Largely by changing our memories.

Judgment, probability, and public policy – As Daniel Kahneman points out, humans are naturally good at grammar but awful at statistics. We create our mental models in System 1, not System 2. How frequently does something happen? We estimate probability based on how easy it is to retrieve memories. What kinds of memories are easy to retrieve? Any memory that’s especially vivid or scary. Thus, we overestimate the probability of violent crime and underestimate the probability of good deeds. We make policy decisions and public investments based on erroneous – but deeply held – predictions.

Less logic, louder voice – people who aren’t very good at something tend to overestimate their skills. It’s the Dunning-Kruger effect – people don’t recognize their own ineptitude. It’s an artifact of System 1. Experts often craft their conclusions very carefully, with many caveats and warnings. Non-experts don’t recognize how limited their knowledge is; they simply assume that they’re right. Thus, they often speak more loudly. It’s the old saying: “He’s seldom right, but never in doubt.”

Teaching critical thinking – I’ve read nearly two dozen textbooks on critical thinking. None of them gives more than a passing remark or two on the essential differences between System 1 and System 2. They focus exclusively on our conscious selves: System 2. In other words, they focus on how we make just 5% of our decisions. It’s time to re-think the way we teach thinking.
