
What don’t you see?
The problem with seeing is that you only see what you see. We see something, make what seem like reasonable deductions from it, and assume that what we see is all there is. All too often, that assumption is erroneous. We wind up making decisions based on partial evidence. Our conclusions are wrong and, very often, consistently biased: we make the same mistake, in the same way, over and over.
As Daniel Kahneman has taught us: what you see isn’t all there is. We’ve seen one of his examples in the story of Steve. Kahneman presents this description:
Steve is very shy and withdrawn, invariably helpful but with little interest in people or in the world of reality. A meek and tidy soul, he has a need for order and structure, and a passion for detail.
Kahneman then asks: is it more likely that Steve is a farmer or a librarian?
If you read only what’s presented to you, you’ll most likely guess wrong. Kahneman wrote the description to fit our stereotype of a male librarian. But male farmers outnumber male librarians by a ratio of about 20:1. Statistically, it’s much more likely that Steve is a farmer. If you knew the base rate, you would guess Steve is a farmer.
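To make the arithmetic concrete, here’s a minimal sketch of the base-rate logic in Python. The 20:1 ratio comes from the paragraph above; the likelihoods are illustrative assumptions (suppose the description is four times more likely to fit a librarian than a farmer), not figures from Kahneman.

```python
# Base rates from the text: roughly 20 male farmers for every male librarian.
prior_farmer = 20 / 21
prior_librarian = 1 / 21

# Assumed likelihoods (illustrative only, not Kahneman's numbers): suppose the
# "meek and tidy" description is four times more likely for a librarian.
p_desc_given_librarian = 0.8
p_desc_given_farmer = 0.2

# Bayes' rule: the posterior is proportional to prior times likelihood.
evidence = (prior_librarian * p_desc_given_librarian
            + prior_farmer * p_desc_given_farmer)
posterior_librarian = prior_librarian * p_desc_given_librarian / evidence
posterior_farmer = prior_farmer * p_desc_given_farmer / evidence

print(f"P(librarian | description) = {posterior_librarian:.2f}")  # about 0.17
print(f"P(farmer | description)    = {posterior_farmer:.2f}")     # about 0.83
```

Even with a description that strongly favors the librarian stereotype, the base rate dominates: the smart money is still on “farmer.”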
We saw a similar example with World War II bombers. Allied bombers returned to base bearing any number of bullet holes. To determine where to place protective armor, analysts mapped out the bullet holes. The key question: which sections of the bomber were most likely to be struck? Those are probably good places to put the armor.
But the analysts only saw planes that survived. They didn’t see the planes that didn’t make it home. If they made their decision based only on the planes they saw, they would place the armor in spots where non-lethal hits occurred. Fortunately, they realized that certain spots were under-represented in their bullet hole inventory – spots around the engines. Bombers that were hit in the engines often didn’t make it home and, thus, weren’t available to see. By understanding what they didn’t see, analysts made the right choices.
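A quick way to feel the survivorship effect is to simulate it. This is only a toy sketch; the zone names, hit rates, and loss probabilities are made-up assumptions, not the analysts’ actual data.

```python
import random

random.seed(0)

# Toy model; every number here is a made-up assumption. Each sortie takes one
# hit in a random zone; an engine hit downs the plane far more often.
ZONES = ["fuselage", "wings", "tail", "engines"]
LOSS_PROB = {"fuselage": 0.05, "wings": 0.05, "tail": 0.05, "engines": 0.60}

actual_hits = {zone: 0 for zone in ZONES}     # hits on all planes, seen or not
observed_hits = {zone: 0 for zone in ZONES}   # hits on planes that made it home

for _ in range(100_000):
    zone = random.choice(ZONES)
    actual_hits[zone] += 1
    if random.random() > LOSS_PROB[zone]:     # the plane survives to be inspected
        observed_hits[zone] += 1

total_actual = sum(actual_hits.values())
total_observed = sum(observed_hits.values())
for zone in ZONES:
    print(f"{zone:9s} actual share {actual_hits[zone] / total_actual:.2f}  "
          f"observed share {observed_hits[zone] / total_observed:.2f}")
```

The engines take their full share of hits, but they are under-represented in the data you can actually see, because those planes rarely came home – exactly the gap the analysts learned to read.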
I like both of these examples but they’re somewhat abstract and removed from our day-to-day experience. So, how about a quick test of our abilities? In the illustration above, which way is the bus going?
Study the image for a while. I’ll post the answer soon.

Don’t be cowed.
In the mid-1790s, an English country doctor named Edward Jenner made a rather routine observation: milkmaids don’t get smallpox. Milkmaids were often exposed to cowpox, a disease that’s related to smallpox but much less deadly. Cowpox gave the milkmaids flu-like symptoms that were distressing but certainly not lethal. Jenner guessed that the cowpox also conferred immunity to smallpox.
Jenner wasn’t the first to observe the cowpox effect but he was the first in the western world to act on his hunch. He created a vaccine from the scrapings of cowpox pustules and administered it to some two-dozen people. They all acquired immunity to smallpox. Jenner conducted experiments to demonstrate the treatment’s efficacy as well as the biological mechanisms in play. As a result, he is often described as the father of modern immunology.
Jenner’s breakthrough came from simple observation. He paid close attention to the world around him, observed an anomaly, and acted on it. Observation provides a foundation for both critical thinking and innovation. If necessity is the mother of invention, then observation is the grandmother. One has to observe the necessity in order to address it. (If a necessity happens in the forest and no one observes it, is it really necessary?)
How does one learn to be a good observer? Interestingly, most critical thinking textbooks don’t address this. Rather, they teach readers how to ask insightful, clarifying questions. That’s useful, of course, but also somewhat limited. Observation is merely a continuation of questioning by other means. Much more than questioning, observation can reveal fundamental insights that produce important innovations – like Jenner’s.
How does one become a good observer? Here are some thoughts I’ve gleaned from reading and from my own experience.
Pay attention – this may seem obvious but it’s hard to do. We’ve all had the experience of driving somewhere and not remembering how we got there. The mind wanders. What to do? Remind yourself to stay in the moment. Make mental notes. Ask yourself “why” questions. Mindfulness training may help.
Keep a journal – you can’t observe everything in a given moment. Observations grow and change over time. You may have half a good idea today. The other half may not occur to you for years. Steven Johnson calls it a slow hunch. The trick to a slow hunch is remembering the first half. If it’s written down, it’s much easier to recall. (Indeed, one of the reasons I write this blog is to remember what I’ve learned).
Slow down – it’s much easier to think clearly and observe effectively if you take your time. The pace of change may well be accelerating but accelerating your thinking is not going to help you.
Pay attention to System 1 – your fast, automatic system knows what the world is supposed to be like. It can alert you to anomalies that System 2 doesn’t recognize.
Enhance your chance by broadening your horizons – Pasteur said, “Chance favors only the prepared mind.” You can prepare your mind by reading widely and by interacting with people who have completely different experiences than yours. Diversity counts.
Look for problems/listen to complaints – if a person is having a problem with something, it creates an opportunity to fix it.
Test your hypothesis – in other words, do something. Your hypothesis may be wrong but you’ll almost certainly learn something by testing it.
Observing is not always easy but it is a skill that can be learned. Many of the people we call geniuses are, more than anything else, superb observers. Like Edward Jenner.
Postscript – Jenner inoculated his first patient in May 1796. Once Jenner showed its efficacy, the treatment spread quickly. So did opposition to it. In 1802, James Gillray, a popular English caricaturist, created the illustration above. Opponents claimed that cows would grow out of the bodies of people who received cowpox vaccines. Anti-vaccine agitation has been entwined with public health initiatives since the very beginning.

Alas, poor System 1…
We have two different thinking systems in our brain, often called System 1 and System 2. System 1 is fast and automatic and makes up to 95% of our decisions. System 2 is a slow energy hog that allows us to think through issues consciously. When we think of thinking, we’re thinking of System 2.
You might ask: Why would this matter to anyone other than neuroscientists? It’s interesting to know but does it have any practical impact? Well, here are some things that we might want to change based on the dual-brain idea.
Economic theory – our classic economic theories depend on the notion of rational people making rational decisions. As Daniel Kahneman points out, that’s not the way the world works. For instance, our loss aversion bias pushes us towards non-rational investment decisions. (See also here). It happens all the time and has created a whole new school of thought called behavioral economics (and a Nobel prize for Kahneman).
Intelligence testing – System 1 makes up to 95% of our decisions but our classic IQ tests focus exclusively on System 2. That doesn’t make sense. We need new tests that incorporate rationality as well as intelligence.
Advertising – we often measure the effectiveness of advertising through awareness tests. Yet System 1 operates below the threshold of awareness. We can know things without knowing that we know them. As Peter Steidl points out, if we make 95% of our decisions in System 1, doesn’t it also follow that we make (roughly) 95% of our purchase decisions in System 1? Branding should focus on our habits and memory rather than our awareness.
Habits (both good and bad) – we know that we shouldn’t procrastinate (or smoke or eat too much, etc.). We know that in System 2, our conscious self. But System 2 doesn’t control our habits; System 1 does. In fact, John Arden calls System 1 the habitual brain. If we want to change our bad habits (or reinforce our good ones), we need to change the habits and rules stored in System 1. How do we do that? Largely by changing our memories.
Judgment, probability, and public policy – As Daniel Kahneman points out, humans are naturally good at grammar but awful at statistics. We create our mental models in System 1, not System 2. How frequently does something happen? We estimate probability based on how easy it is to retrieve memories. What kinds of memories are easy to retrieve? Any memory that’s especially vivid or scary. Thus, we overestimate the probability of violent crime and underestimate the probability of good deeds. We make policy decisions and public investments based on erroneous – but deeply held – predictions.
Less logic, louder voice – people who aren’t very good at something tend to overestimate their skills. It’s the Dunning-Kruger effect – people don’t recognize their own ineptitude. It’s an artifact of System 1. Experts will often craft their conclusions very carefully with many caveats and warnings. Non-experts don’t know that their expertise is limited; they simply assume that they’re right. Thus, they often speak more loudly. It’s the old saying: “He’s seldom right but never in doubt”.
Teaching critical thinking – I’ve read nearly two-dozen textbooks on critical thinking. None of them give more than a passing remark or two on the essential differences between System 1 and System 2. They focus exclusively on our conscious selves: System 2. In other words, they focus on how we make five per cent of our decisions. It’s time to re-think the way we teach thinking.

What else should I consider?
One of the critical skills of critical thinking is the ability to ask insightful, illuminating questions. Indeed, questioning is an attitude as much as a skill. As information flows into our brains, we know that it needs to be checked for clarity and accuracy. We assume the same attitude that a warehouse manager might have: all incoming deliveries need to be checked.
I always keep two simple questions in mind:
Why do I/you think that?
How do I/you know that?
These help me review incoming information quickly. I can then assign the information to three large – and fairly fuzzy – categories:
Probably true
Probably not true
Unproven/can’t evaluate
The virtue of my little system is not that it’s 100% accurate but that it’s simple. I can do it in real-time and keep rough track of ideas and concepts while discussing them. It’s fuzzy and lumpy but also quite handy.
When I asked my students about their go-to questions, they came up with five categories in a stepwise process, beginning with gaining self-control and ending with fixing the process. Their system covers more bases than mine and the categories are more structured. The downside? I find it hard to remember five things simultaneously. It’s great for moments of quiet reflection and evaluation but less useful for real-time discussions.
I may have found a compromise between the two systems. Joe Y.F. Lau calls it the fourfold path to good thinking in his book, An Introduction to Critical Thinking and Creativity. The four questions in Lau’s path are straightforward:
The last question seems very similar to two other sources. As Daniel Kahneman reminds us: what you see is not all there is. As the Brothers Heath remind us: the first step of the WRAP decision process is to Widen Your Options.
The output of Lau’s system becomes the input to Kahneman’s and the Heaths’ decision-making frameworks. The beauty of Lau’s system is that it verifies and evaluates the information before it gets into the decision process. Like the warehouse manager, we ensure the quality of the input. That’s necessary (but not sufficient) to ensure the quality of the output.
The other virtue of Lau’s approach is its simplicity. I can keep four points in mind (most of the time). Perhaps we can simplify it even further by creating an acronym from it. Anyone want to try?

One building or two?
A fact is a fact is a fact. Isn’t it? Well … not so fast.
Let’s think of a simple fact … like the number of rooms in a building. That seems factual, doesn’t it? It’s observable, verifiable, reliable, and objective. Multiple, independent observers should come to the same conclusion. If John says it’s X, Mary can verify his work by counting again. Further, it doesn’t change over time. Today’s answer should be the same as tomorrow’s. Additionally, it’s not subjective; it doesn’t depend on your mood. It’s objective – it exists in the real world, not just inside your head.
That’s what you may think but it’s actually a bit more complicated. I often run an experiment in my classes and workshops that shows how difficult it can be to pin down a fact.
I start by asking for three volunteers. My instructions are simple: “I want you to answer the question: how many rooms are on this floor of this building?” If they ask questions, I say, in a slightly superior voice, “This is a simple exercise. There’s no need to ask questions. Don’t make it more complicated than it is. Just go and do it.” (Did you ever have a boss that did this to you?)
Then I send the volunteers out one at a time to count the rooms. Each one is instructed to count the number of rooms, write the number on a piece of paper, give the paper to me, and tell no one else the number.
I collect the three answers and compare them for the class. The three answers are never the same. In fact, they often vary by a factor of two. One person may count 24 rooms; another counts 48. Occasionally, I’ll find that two answers are the same but, if so, the third is usually quite a bit different.
How do we account for this? Two thoughts come to mind. First, the exercise contains three different concepts that need to be defined: room, floor, and building. What is a room? Does a janitor’s closet count? A bathroom? What’s a floor in this sense? What if you’re in a split-level building? And what’s a building? What if two buildings are joined together, as is the building where I teach at the University of Denver?
Second, this helps to illustrate how reality is an internal concept. Each volunteer has a model of reality in her head. But the models are different. In some sense, there is no external, verifiable reality. We just build models and your model is different from mine. The result? Endless confusion and controversy. Moral: be careful with your facts.