Since I began teaching critical thinking four years ago, I’ve bought a lot of books on the subject. The other day, I wondered how many of those books I’ve actually read.
So, I made two piles on the floor. In one pile, I stacked all the books that I have read (some more than once). In the other pile, I stacked the books that I haven’t read.
Guess what? The unread stack is about twice as high as the other stack. In other words, I’ve read about a third of the books I’ve acquired on critical thinking and have yet to read about two-thirds.
What can I conclude from this? My first thought: I need to take a vacation and do a lot of reading. My second thought: Maybe I shouldn’t mention this to the Dean.
I also wondered: how much do I not know? Do I really know only a third of what there is to know about the topic? Maybe I know more, since there's bound to be some repetition in the books. Or maybe I know less, since my modest collection may not cover the entire topic. Hmmm…
The point is that I'm thinking about what I don't know rather than what I do know. That instills in me a certain amount of doubt. When I make assertions about critical thinking, I add cautionary words like "perhaps," "maybe," or "the evidence suggests." I leave myself room to change my position as new knowledge emerges (or as I acquire knowledge that's new to me).
I suspect that the world might be better off if we all spent more time thinking about what we don't know. And it's not just me: the Dunning-Kruger effect describes essentially the same phenomenon.
David Dunning and Justin Kruger, both at Cornell, study cognitive biases. In their studies, they documented a bias now known as the Dunning-Kruger effect, a form of illusory superiority: simply put, we overestimate our own abilities and skills relative to others. More specifically, the less we know about a given topic, the more likely we are to overestimate our abilities. In other words, the less we know, the more confident we are in our opinions. As David Dunning succinctly puts it, "…incompetent people do not recognize—scratch that, cannot recognize—just how incompetent they are."
The opposite seems to be true as well. Highly competent people tend to underestimate their competence relative to others. The thinking goes like this: If it’s easy for me, it must be easy for others as well. I’m not so special.
I’ve found that I can use the Dunning-Kruger effect as a rough-and-ready test of credibility. If a source provides a small amount of information with a high degree of confidence, its credibility declines in my estimation. On the other hand, if a source provides a lot of information tempered with some degree of doubt, its credibility rises. It’s one way to tell a wise person from a fool.
Perhaps we can use the same concept to greater effect in our teaching. When we learn about a topic, we implicitly learn about what we don’t know. Maybe we should make it more explicit. Maybe we should count the books we’ve read and the books we haven’t read to make it very clear just how much we don’t know. If we were less certain of our opinions, we would be more open to other people and intriguing new ideas. That can’t be a bad thing.
I wonder whether scientists have the same doubts about their knowledge.
I love the Darwin quote: “Ignorance more frequently begets confidence than does knowledge.”