When driving home from a party, I may ask Suellen a question like, “Why did Pat make that cutting remark about Kim?” Suellen will then launch into a thorough exegesis about relationships, personal histories, boyfriends, girlfriends, children, parents, gardening, the nature of education, and the tangled web we weave when first we practice to deceive. In the end, it will all make sense — even to me, a socially challenged kind of guy.
Suellen is great at answering questions like these. This skill is often referred to as social or emotional intelligence. It’s about people and relationships and empathy. I’m generally better at academic intelligence and questions like, “How do you calculate the volume of a sphere?” (I don’t mean to say that I’m better at academic intelligence than Suellen is … only that I’m better at academic intelligence than I am at social intelligence. I hope that’s clear … I wouldn’t want my lack of social intelligence to lead me to insult my own wife.)
For me, two intelligences — academic and social — have been quite enough. But not for Howard Gardner. In Five Minds for the Future, Gardner suggests that there are five different intelligences and, if education is to succeed in the future, we need to teach them all.
I’m fairly well versed in the tenets of critical thinking. Now I’m trying to understand Gardner’s theory of multiple intelligences. Why? Because I’d like to mash up critical thinking and multiple intelligences. I’m wondering if critical thinking works the same way in each intelligence. Can you think critically in, say, academic intelligence, while thinking uncritically in social intelligence? That’s certainly the stereotype of the absent-minded professor.
To mash up critical thinking and the five minds, let’s first look at Gardner’s theory. The five minds are:
Disciplined mind — to master the way of thinking associated with a specific discipline — say, economics, psychology, or mathematics. I think (hope) it’s also broader than that. I’m certainly trained in the Western way of thinking. I categorize and classify things without even thinking about it. I’m now looking at Zen as a different way of thinking — one that destroys categories rather than creates them. That’s certainly a different discipline.
Synthesizing mind — the ability to put it all together. Gardner points out that memorization was important in times characterized by low literacy. In today’s era of Big Data, synthesis is much more important and memorization much less important.
Creating mind — proposing new ideas, fresh questions, unexpected answers. As I’ve noted before in this blog, a new idea is often a mashup of multiple existing ideas. To propose something that doesn’t exist, you need to be well versed in what does exist.
Respectful mind — “… notes and welcomes differences between human individuals and between human groups….” This is very similar to the concept of fair-mindedness as used in critical thinking. This could be our first mashup.
Ethical mind — how can we serve purposes beyond self-interest and how can “citizens…work unselfishly to improve the lot of all.” Again, this is quite similar to concepts used in critical thinking, including ethical thinking and the ability to overcome egocentric thinking.
Today, I simply want to introduce Gardner’s five minds. In future posts, I’ll try to weave together critical thinking, Gardner’s concepts of multiple intelligences, and the Hofstedes’ research on the five dimensions of culture. I hope you’ll tag along.
By the way, the volume of a sphere is 4/3πr³.
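For the academically inclined, the formula translates directly into code. Here’s a minimal sketch (the function name `sphere_volume` is my own, not anything from the post):

```python
from math import pi

def sphere_volume(radius: float) -> float:
    """Volume of a sphere: V = (4/3) * pi * r^3."""
    return (4.0 / 3.0) * pi * radius ** 3

# A sphere of radius 1 has volume 4*pi/3, roughly 4.19.
print(round(sphere_volume(1.0), 2))
```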
I used to teach research methods. Now I teach critical thinking. Research is about creating knowledge. Critical thinking is about assessing knowledge. In research methods, the goal is to create well-designed studies that allow us to determine whether something is true or not. A well-designed study, even if it finds that something is not true, adds to our knowledge. A poorly designed study adds nothing. The emphasis is on design.
In critical thinking, the emphasis is on assessment. We seek to sort out what is true, not true, or not proven in our info-sphere. To succeed, we need to understand research design. We also need to understand the logic of critical thinking — a stepwise progression through which we can discover fallacies and biases and self-serving arguments. It takes time. In fact, the first rule I teach is “Slow down. Take your time. Ask questions. Don’t jump to conclusions.”
In both research and critical thinking, a key question is: how do we know if something is true? Further, how do we know if we’re being fair minded and objective in making such an assessment? We discuss levels of evidence that are independent of our subjective experience. Over the years, thinkers have used a number of different schemes to categorize evidence and evaluate its quality. Today, the research world seems to be coalescing around a classification of evidence that has been evolving since the early 1990s as part of the movement toward evidence-based medicine (EBM).
The classification scheme (typically) has four levels, with 4 being the weakest and 1 being the strongest. From weakest to strongest, here they are:

Level 4 — expert opinion, anecdotes, and individual case reports.

Level 3 — observational studies, such as cohort and case-control studies.

Level 2 — randomized controlled trials.

Level 1 — systematic reviews and meta-analyses of randomized controlled trials.
You might keep this guide in mind as you read your daily newspaper. Much of the “evidence” that’s presented in the media today doesn’t even reach the minimum standards of Level 4. It’s simply opinion. Stating opinions is fine, as long as we understand that they don’t qualify as credible evidence.
Some years ago, Suellen, Elliot, and I flew from Los Angeles to Sydney, Australia — a long, somewhat dreary, overnight flight with several hundred people on a jumbo jet. The flight was smooth and uneventful with one bizarre exception. About six hours into the flight — as we were all trying to sleep — the plane’s oxygen masks suddenly deployed and fell into our laps. Nothing seemed wrong. There was no noise or bumping or vibration or swerving. Just oxygen masks in our laps. We woke up, looked around at the other passengers, concluded that nothing was wrong … and ignored the masks.
It turned out that we were right. The pilot announced that someone had “pushed the wrong button” in the cockpit and released the masks. He advised us to ignore the masks, which we were already successfully doing. Later in the flight, I spoke with a flight attendant who told me she was shocked that none of the passengers had followed the “proper procedures” and donned the masks. I said that it seemed obvious that it wasn’t an emergency. She asked, “How did you know that?” I said, “By looking at the other passengers. Nobody was scared.”
I was reminded of this incident as I was re-reading (yet again) a chapter in Robert Cialdini’s book, Influence. Our little adventure on the airplane was a classic example of social proof. When we’re in an ambiguous situation and not sure what’s happening, one of the first things we do is to look at other people. If they’re panicking, then maybe we should too. If they’re calm, we can relax.
Cialdini points out that social proof affects us even when we’re aware of it. The example? Laugh tracks on TV. We all claim to dislike laugh tracks and also claim that they have no effect on us. But experimental research suggests otherwise. When people watch a TV show with a laugh track, they laugh longer and harder than other people watching the same show without the track. We realize that we’re being manipulated but we still succumb. According to Cialdini, the effect is more pronounced with bad jokes than with good ones. If so, Seth MacFarlane clearly needed a laugh track at this year’s Academy Awards.
Cialdini refers to one of the problems of social proof as pluralistic ignorance. I looked around at other people on the airplane and they seemed calm and unfazed. At the same time, they were looking at me and I seemed … well, calm and unfazed. As I looked at them, I thought, “No need to get excited.” As they looked at me, they thought the same. None of us knew what was really going on, but we were influencing each other to ignore a potentially life-threatening emergency.
Cialdini argues that pluralistic ignorance makes “safety in numbers” meaningless. (See also my post on the risky shift). Cialdini cites research on staged emergencies — a person apparently has an epileptic seizure. The person is helped “…85 percent of the time when there was a single bystander present but only 31 percent of the time with five bystanders present.” A single bystander seems to assume “if in doubt, help out”. Multiple bystanders look at each other and conclude that there’s no emergency.
So, what to do? If you have to have a heart attack, do it when only one other person is around.
Last week, a man in Florida was swallowed by a sinkhole while he slept. This week, I’m more worried about sinkholes in Florida than I am about driving on icy roads in Colorado. Is that logical?
It’s not logical but it’s very real. Sometimes a story is so vivid, so unexpected, so emotionally fraught, and so available that it dominates our thinking. Even though it’s extremely unlikely, it becomes possible, maybe even probable in our imaginations. As Daniel Kahneman points out, “The world in our heads is not a precise replica of reality.”
What makes a phenomenon more real in our heads than it is in reality? Several things. It’s vivid — it creates a very clear image in our mind. It’s creepy — the vivid image is unpleasant and scary. It’s a “bad” death as opposed to a “good” death. We read about deaths every day. When we read about a kindly old nonagenarian dying peacefully after a lifetime of doing good works, it seems natural and honorable. It’s a good death. When we read about someone killed in the prime of life in bizarre or malevolent circumstances, it’s a “bad” death. A bad death is much more vivid than a good death.
But what really makes an image dominate our minds is availability. How easy is it to bring an instance to mind? If the thought is readily available to us, we deem it to be likely. What’s readily available? Anything that’s in the popular media and the topic of discussion with friends and colleagues. If your colleagues around the water cooler say, “Hey, did you hear about the guy in the sinkhole?” you’ve already begun to blow it out of proportion.
Availability can also compound itself in what Kahneman calls an availability cascade. The story itself becomes the story. Suppose that a suspicious compound — let’s call it chenesium — is found in the food supply. Someone writes that chenesium causes cancer in rats when administered in huge doses. Plus, it’s a vividly scary form of cancer — it affects the eyeballs and makes you look like a zombie. People start writing letters to the editor about the grave danger. Now it’s in the media. People start marching on state capitols, demanding action. The media write about the marching. People read about the marching and assume that where there’s smoke, there’s fire. The Surgeon General issues a statement saying the danger is minimal. But the populace — now worked into a frenzy — denounces her as a lackey of chenesium producers. Note that the media are no longer writing about chenesium. Rather, they’re writing about the controversy surrounding chenesium. The story keeps growing because it’s a good story. It’s a perfect storm.
So, what to do? Unfortunately, facts don’t matter a whole lot by this point. As Kahneman notes (quoting Jonathan Haidt), “The emotional tail wags the rational dog.” The only thing to do is to let it play out … sooner or later, another controversy will arise to take chenesium’s place.
At the personal level, we can spare ourselves a lot of worry by pondering the availability bias and remembering that facts do matter. We can look up the probability of succumbing to a sinkhole. If we do, we’ll realize that the danger is vanishingly small. There’s nothing to worry about. Still, I’m not going to Florida anytime soon.
My mother was a great lady but not a great cook. TV dinners were a popular option at our house when I was a kid. If we weren’t eating TV dinners we might have to eat … frozen fish sticks. I can still smell the oily odor of limp fish sticks frying up in the little Sunbeam electric skillet. It permeated everything. I grew up in a clean-your-plate family so, ultimately, I had to choke down those mysterious fish parts. Then, without fail, I raced to the bathroom and threw up.
How does one think critically about such a situation? In our family, we quickly ruled out several non-causes. Everyone else in the family ate fish sticks and didn’t get sick. Therefore, it couldn’t be the fish sticks. Every serving of fish sticks made me sick, so we couldn’t blame it on just one box that had gone bad. Clearly, I must be allergic to fish.
So, from the age of about six to 23, I ate no fish at all. No trout or tuna or herring or salmon or swordfish. After college, I moved to Ecuador and, from time to time, took vacations to the beach. On one such vacation, I found that there was nothing to eat locally but fish. Finally, I sat in a restaurant, braced myself for the worst, and took a bite of fish. I thought, “Wow, this is really good!” I wasn’t allergic to fish at all … just greasy, stinky frozen fish sticks.
I had been framing myself. I made an assumption about myself based on faulty evidence and stuck with it for almost 17 years. I never thought to re-check the original assumption or re-validate the evidence. I never tried to think outside the frame. Over the past several weeks, I’ve written about the issues of what might be called “external framing” in this blog. Here are some examples:
In these cases, one person is framing another. The same thing can happen to abstract issues. I’ve noticed that Republicans and Democrats frame the same issue in very different ways.
What we forget sometimes is that we can also frame ourselves. I used to teach a course in research methods that included a mild dose of inferential statistics. Many of my students were women in their 30s and 40s who were returning to school to re-start their careers. Many of them were very nervous about the statistics. They believed they weren’t good at math. As it turned out, they did just fine. They weren’t bad at math; they had just framed themselves into believing they were bad at math.
If you believe something about yourself, you might just want to poke at it a bit. Sometimes you may just be wrong. On other occasions, you may be right — I’m still terrible at anything having to do with music. Still, it’s worth asking the question. Otherwise, you may miss out on some very tasty fish.