In February, I wrote about premature commitment. According to Paul Nutt in his book, Why Decisions Fail, premature commitments all too often lead to debacles — decisions gone spectacularly and publicly wrong. The process is fairly simple: 1) we have a problem; 2) a beguiling solution is proposed; 3) we jump on the solution with undue haste and without considering our options or searching for alternatives. After all, we have a solution, don’t we? Why bother looking for another one?
As we read Nutt’s book in my classes, I can tell that students are grasping the general concept intellectually. It’s clear — intellectually and academically — that you shouldn’t commit too soon. Step back, look around, ask questions, survey the possibilities — then make a decision.
That’s all well and good in the classroom, but will my students actually be patient when the pressure is on and everyone wants to be a hero? I’m not so sure. So, I’ve been looking for ways to show students what it feels like to make a premature commitment. By experiencing the process — rather than just reading about it — I’m hoping to imprint something on them. When you’re under pressure and a crisis is looming, it’s hard to think clearly. It’s easier to remember an experience than it is to organize your thoughts and respond to a novel situation.
I’ve discovered a video that helps students make the connection. Actually, I’ve known about the video for some time but I used to use it for a different purpose. Then it dawned on me that the video provides a good demonstration of a premature commitment. So, I’m re-purposing the way I teach it. Perhaps that’s an example of mashup thinking.
The video requires you to concentrate your attention for about 90 seconds and count the number of times a specific action happens. Here’s what I’d like you to do: Watch the video twice. The first time, focus intently on the task at hand (the video will explain what to do). Count the number of times the specified action happens and record the number. There is one (and only one) correct answer. Then watch the video a second time and don’t bother to count. Just observe what goes on. Don’t read on until you’ve watched the video twice. You can find the video here.
Watch the video (twice) before proceeding
Did you miss anything the first time you watched the video? Did you notice it the second time? (I’m not going to give it away here, but if you find this confusing, send me an e-mail and I’ll explain it.)
About two-thirds of the people who follow the instructions miss an important element of the video the first time they watch it. Perhaps the key phrase here is “people who follow the instructions”. Basically, I conned you into making a premature commitment. I convinced you that — to get the right answer — you needed to pay close attention to the action and count carefully. You decided that it was important to get the right answer, so you played by the rules I imposed. Because you played by the rules, you missed something important in the environment.
What’s the message here? It’s easy to get caught up in the situation. It’s easy to buy into the “rules” that a situation seems to impose on you. It’s easy to let other people rush you to judgment. It’s easy to con yourself. The next time you’re at work and a problem arises and everybody is rushing to find a solution, just ask yourself: “Am I missing the gorilla?”
When driving home from a party, I may ask Suellen a question like, “Why did Pat make that cutting remark about Kim?” Suellen will then launch into a thorough exegesis about relationships, personal histories, boyfriends, girlfriends, children, parents, gardening, the nature of education, and the tangled web we weave when first we practice to deceive. In the end, it will all make sense — even to me, a socially challenged kind of guy.
Suellen is great at answering questions like these. It’s often referred to as social or emotional intelligence. It’s about people and relationships and empathy. I’m generally better at academic intelligence and questions like, “How do you calculate the volume of a sphere?” (I don’t mean to say that I’m better at academic intelligence than Suellen is … but that I’m better at academic intelligence than I am at social intelligence. I hope that’s clear… I wouldn’t want my lack of social intelligence to lead me to insult my own wife.)
For me, two intelligences — academic and social — have been quite enough. But not for Howard Gardner. In Five Minds for the Future, Gardner suggests that there are five different intelligences and, if education is to succeed in the future, we need to teach them all.
I’m fairly well versed in the tenets of critical thinking. Now I’m trying to understand Gardner’s theory of multiple intelligences. Why? Because I’d like to mash up critical thinking and multiple intelligences. I’m wondering if critical thinking works the same way in each intelligence. Can you think critically in, say, academic intelligence, while thinking uncritically in social intelligence? That’s certainly the stereotype of the absent-minded professor.
To mash up critical thinking and the five minds, let’s first look at Gardner’s theory. The five minds are:
Disciplined mind — to master the way of thinking associated with a specific discipline — say, economics, psychology, or mathematics. I think (hope) it’s also broader than that. I’m certainly trained in the Western way of thinking. I categorize and classify things without even thinking about it. I’m now looking at Zen as a different way of thinking — one that destroys categories rather than creates them. That’s certainly a different discipline.
Synthesizing mind — the ability to put it all together. Gardner points out that memorization was important in times characterized by low literacy. In today’s era of Big Data, synthesis is much more important and memorization much less important.
Creating mind — proposing new ideas, fresh questions, unexpected answers. As I’ve noted before in this blog, a new idea is often a mashup of multiple existing ideas. To propose something that doesn’t exist, you need to be well versed in what does exist.
Respectful mind — “… notes and welcomes differences between human individuals and between human groups….” This is very similar to the concept of fair mindedness as used in critical thinking. This could be our first mashup.
Ethical mind — how can we serve purposes beyond self-interest and how can “citizens…work unselfishly to improve the lot of all.” Again, this is quite similar to concepts used in critical thinking, including ethical thinking and the ability to overcome egocentric thinking.
Today, I simply want to introduce Gardner’s five minds. In future posts, I’ll try to weave together critical thinking, Gardner’s concepts of multiple intelligences, and the Hofstedes’ research on the five dimensions of culture. I hope you’ll tag along.
By the way, the volume of a sphere is 4/3πr³.
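For the academically inclined, that formula takes only a few lines of Python (the function name and sample radius are my own, purely for illustration):

```python
import math

def sphere_volume(radius: float) -> float:
    """Volume of a sphere of radius r: (4/3) * pi * r**3."""
    return (4.0 / 3.0) * math.pi * radius ** 3

# A unit sphere has volume 4/3 * pi, roughly 4.19.
print(round(sphere_volume(1.0), 2))  # 4.19
```

Note that doubling the radius multiplies the volume by eight, since the radius is cubed.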
I used to teach research methods. Now I teach critical thinking. Research is about creating knowledge. Critical thinking is about assessing knowledge. In research methods, the goal is to create well-designed studies that allow us to determine whether something is true or not. A well-designed study, even if it finds that something is not true, adds to our knowledge. A poorly designed study adds nothing. The emphasis is on design.
In critical thinking, the emphasis is on assessment. We seek to sort out what is true, not true, or not proven in our info-sphere. To succeed, we need to understand research design. We also need to understand the logic of critical thinking — a stepwise progression through which we can discover fallacies and biases and self-serving arguments. It takes time. In fact, the first rule I teach is “Slow down. Take your time. Ask questions. Don’t jump to conclusions.”
In both research and critical thinking, a key question is: how do we know if something is true? Further, how do we know if we’re being fair minded and objective in making such an assessment? We discuss levels of evidence that are independent of our subjective experience. Over the years, thinkers have used a number of different schemes to categorize evidence and evaluate its quality. Today, the research world seems to be coalescing around a classification of evidence that has been evolving since the early 1990s as part of the movement toward evidence-based medicine (EBM).
The classification scheme (typically) has four levels, with 4 being the weakest and 1 being the strongest. From weakest to strongest, here they are:
Level 4 — expert opinion, anecdotes, and case reports.
Level 3 — observational studies, such as case-control and cohort studies.
Level 2 — randomized controlled trials.
Level 1 — systematic reviews and meta-analyses of multiple randomized controlled trials.
You might keep this guide in mind as you read your daily newspaper. Much of the “evidence” that’s presented in the media today doesn’t even reach the minimum standards of Level 4. It’s simply opinion. Stating opinions is fine, as long as we understand that they don’t qualify as credible evidence.
Some years ago, Suellen, Elliot, and I flew from Los Angeles to Sydney, Australia — a long, somewhat dreary, overnight flight with several hundred people on a jumbo jet. The flight was smooth and uneventful with one bizarre exception. About six hours into the flight — as we were all trying to sleep — the plane’s oxygen masks suddenly deployed and fell into our laps. Nothing seemed wrong. There was no noise or bumping or vibration or swerving. Just oxygen masks in our laps. We woke up, looked around at the other passengers, concluded that nothing was wrong … and ignored the masks.
It turned out that we were right. The pilot announced that someone had “pushed the wrong button” in the cockpit and released the masks. He advised us to ignore the masks, which we were already successfully doing. Later in the flight, I spoke with a flight attendant who told me she was shocked that none of the passengers had followed the “proper procedures” and donned the masks. I said that it seemed obvious that it wasn’t an emergency. She asked, “How did you know that?” I said, “By looking at the other passengers. Nobody was scared.”
I was reminded of this incident as I was re-reading (yet again) a chapter in Robert Cialdini’s book, Influence. Our little adventure on the airplane was a classic example of social proof. When we’re in an ambiguous situation and not sure what’s happening, one of the first things we do is to look at other people. If they’re panicking, then maybe we should too. If they’re calm, we can relax.
Cialdini points out that social proof affects us even when we’re aware of it. The example? Laugh tracks on TV. We all claim to dislike laugh tracks and also claim that they have no effect on us. But experimental research suggests otherwise. When people watch a TV show with a laugh track, they laugh longer and harder than other people watching the same show without the track. We realize that we’re being manipulated but we still succumb. According to Cialdini, the effect is more pronounced with bad jokes than with good ones. If so, Seth MacFarlane clearly needed a laugh track at this year’s Academy Awards.
Cialdini refers to one of the problems of social proof as pluralistic ignorance. I looked around at other people on the airplane and they seemed calm and unfazed. At the same time, they were looking at me and I seemed … well, calm and unfazed. As I looked at them, I thought, “No need to get excited”. As they looked at me, they thought the same. None of us knew what was really going on but we were influencing each other to ignore a potentially life-threatening emergency.
Cialdini argues that pluralistic ignorance makes “safety in numbers” meaningless. (See also my post on the risky shift). Cialdini cites research on staged emergencies — a person apparently has an epileptic seizure. The person is helped “…85 percent of the time when there was a single bystander present but only 31 percent of the time with five bystanders present.” A single bystander seems to assume “if in doubt, help out”. Multiple bystanders look at each other and conclude that there’s no emergency.
So, what to do? If you have to have a heart attack, do it when only one other person is around.
Last week, a man in Florida was swallowed by a sinkhole while he slept. This week, I’m more worried about sinkholes in Florida than I am about driving on icy roads in Colorado. Is that logical?
It’s not logical but it’s very real. Sometimes a story is so vivid, so unexpected, so emotionally fraught, and so available that it dominates our thinking. Even though it’s extremely unlikely, it becomes possible, maybe even probable in our imaginations. As Daniel Kahneman points out, “The world in our heads is not a precise replica of reality.”
What makes a phenomenon more real in our heads than it is in reality? Several things. It’s vivid — it creates a very clear image in our mind. It’s creepy — the vivid image is unpleasant and scary. It’s a “bad” death as opposed to a “good” death. We read about deaths every day. When we read about a kindly old nonagenarian dying peacefully after a lifetime of doing good works, it seems natural and honorable. It’s a good death. When we read about someone killed in the prime of life in bizarre or malevolent circumstances, it’s a “bad” death. A bad death is much more vivid than a good death.
But what really makes an image dominate our minds is availability. How easy is it to bring an instance to mind? If the thought is readily available to us, we deem it to be likely. What’s readily available? Anything that’s in the popular media and the topic of discussion with friends and colleagues. If your colleagues around the water cooler say, “Hey, did you hear about the guy in the sinkhole?” you’ve already begun to blow it out of proportion.
Availability can also compound itself in what Kahneman calls an availability cascade. The story itself becomes the story. Suppose that a suspicious compound — let’s call it chenesium — is found in the food supply. Someone writes that chenesium causes cancer in rats when administered in huge doses. Plus, it’s a vividly scary form of cancer — it affects the eyeballs and makes you look like a zombie. People start writing letters to the editor about the grave danger. Now it’s in the media. People start marching on state capitols, demanding action. The media write about the marching. People read about the marching and assume that where there’s smoke, there’s fire. The Surgeon General issues a statement saying the danger is minimal. But the populace — now worked into a frenzy — denounces her as a lackey of chenesium producers. Note that the media are no longer writing about chenesium. Rather, they’re writing about the controversy surrounding chenesium. The story keeps growing because it’s a good story. It’s a perfect storm.
So, what to do? Unfortunately, facts don’t matter a whole lot by this point. As Kahneman notes (quoting Jonathan Haidt), “The emotional tail wags the rational dog.” The only thing to do is to let it play out … sooner or later, another controversy will arise to take chenesium’s place.
At the personal level, we can spare ourselves a lot of worry by pondering the availability bias and remembering that facts do matter. We can look up the probability of succumbing to a sinkhole. If we do, we’ll realize that the danger is vanishingly small. There’s nothing to worry about. Still, I’m not going to Florida anytime soon.
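If you want to run that arithmetic yourself, the back-of-the-envelope version is a single division. The event counts and populations below are made-up placeholders I chose for illustration, not real statistics:

```python
def annual_risk(events_per_year: float, population: float) -> float:
    """Rough per-person annual probability: events divided by exposed population."""
    return events_per_year / population

# Made-up placeholder numbers, purely illustrative, not real statistics.
sinkhole_risk = annual_risk(events_per_year=1, population=20_000_000)
icy_road_risk = annual_risk(events_per_year=100, population=5_000_000)

# On these assumptions, the familiar risk dwarfs the vivid one.
print(round(icy_road_risk / sinkhole_risk))  # 400
```

The point isn’t these particular numbers; it’s that putting both risks on the same per-person scale makes the comparison obvious, which is exactly what the availability bias prevents us from doing in our heads.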