I’ve been reading David Brooks’ book, The Social Animal: The Hidden Sources of Love, Character, and Achievement. The basic idea is fairly simple: we are not alone. How we interact with each other strongly influences who we are and what we become.
Often, however, we don’t recognize just how strong those social forces are. Many of them operate at subconscious levels. Citing Strangers to Ourselves, by Timothy Wilson, Brooks estimates that our minds can take in 11 million pieces of information at any given time. But we’re only aware of 40 of them, at most. Wilson writes that, “Some researchers … suggest that the unconscious mind does virtually all the work and that conscious will may be an illusion.”
Brooks compares the conscious mind to a “… general atop a platform, who sees the world from a distance…” while the unconscious mind is “…like a million little scouts.” The scouts “… maintain no distance from the environment around them, but are immersed in it.”
Brooks also cites Daniel Patrick Moynihan in writing that “… the central evolutionary truth is that the unconscious matters most. The central humanistic truth is that the conscious mind can influence the unconscious.”
If this sounds like the System 1 and System 2 that Daniel Kahneman writes about (click here), well… you’re probably right. System 1 is always on, it’s automatic, and it makes quick decisions, often without your realizing it. System 1 is the default setting. Unless System 2 intervenes, System 1 will spin merrily along, running your life. While System 1 is right most of the time, it can make systematic mistakes.
While Brooks’ writing covers similar territory, he approaches it from a different angle than Kahneman does. He treats not only the influence of the two systems but also the influence of other people. How we behave is remarkably shaped by those around us.
I expect to write more about Brooks and Kahneman and how they compare. Today, however, I’ll just summarize some interesting tidbits that I’ve picked up from Brooks.
I hope these tidbits capture your imagination. They certainly have captured mine and I’ll write a lot more about Brooks and Kahneman in the coming weeks.
Last week, a man in Florida was swallowed by a sinkhole while he slept. This week, I’m more worried about sinkholes in Florida than about driving on icy roads in Colorado. Is that logical?
It’s not logical but it’s very real. Sometimes a story is so vivid, so unexpected, so emotionally fraught, and so available that it dominates our thinking. Even though it’s extremely unlikely, it becomes possible, maybe even probable in our imaginations. As Daniel Kahneman points out, “The world in our heads is not a precise replica of reality.”
What makes a phenomenon more real in our heads than it is in reality? Several things. It’s vivid — it creates a very clear image in our mind. It’s creepy — the vivid image is unpleasant and scary. It’s a “bad” death as opposed to a “good” death. We read about deaths every day. When we read about a kindly old nonagenarian dying peacefully after a lifetime of doing good works, it seems natural and honorable. It’s a good death. When we read about someone killed in the prime of life in bizarre or malevolent circumstances, it’s a “bad” death. A bad death is much more vivid than a good death.
But what really makes an image dominate our minds is availability. How easy is it to bring an instance to mind? If the thought is readily available to us, we deem it to be likely. What’s readily available? Anything that’s in the popular media and the topic of discussion with friends and colleagues. If your colleagues around the water cooler say, “Hey, did you hear about the guy in the sinkhole?” you’ve already begun to blow it out of proportion.
Availability can also compound itself in what Kahneman calls an availability cascade. The story itself becomes the story. Suppose that a suspicious compound — let’s call it chenesium — is found in the food supply. Someone writes that chenesium causes cancer in rats when administered in huge doses. Plus, it’s a vividly scary form of cancer — it affects the eyeballs and makes you look like a zombie. People start writing letters to the editor about the grave danger. Now it’s in the media. People start marching on state capitols, demanding action. The media write about the marching. People read about the marching and assume that where there’s smoke, there’s fire. The Surgeon General issues a statement saying the danger is minimal. But the populace — now worked into a frenzy — denounces her as a lackey of chenesium producers. Note that the media are no longer writing about chenesium. Rather, they’re writing about the controversy surrounding chenesium. The story keeps growing because it’s a good story. It’s a perfect storm.
So, what to do? Unfortunately, facts don’t matter a whole lot by this point. As Kahneman notes (quoting Jonathan Haidt), “The emotional tail wags the rational dog.” The only thing to do is to let it play out … sooner or later, another controversy will arise to take chenesium’s place.
At the personal level, we can spare ourselves a lot of worry by pondering the availability bias and remembering that facts do matter. We can look up the probability of succumbing to a sinkhole. If we do, we’ll realize that the danger is vanishingly small. There’s nothing to worry about. Still, I’m not going to Florida anytime soon.
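Looking up the numbers really does deflate the fear. Here’s a rough back-of-envelope sketch of the comparison — the figures are illustrative assumptions, not sourced statistics (sinkhole deaths are on the order of one a year nationwide; U.S. traffic deaths run in the tens of thousands):

```python
# Order-of-magnitude comparison of two risks. All figures below are
# illustrative assumptions for the sketch, not sourced data.

us_population = 330_000_000          # approximate U.S. population
sinkhole_deaths_per_year = 1         # generous: roughly one per year nationwide
traffic_deaths_per_year = 35_000     # approximate annual U.S. traffic deaths

# Naive annual per-person risk: deaths divided by population
p_sinkhole = sinkhole_deaths_per_year / us_population
p_traffic = traffic_deaths_per_year / us_population

print(f"Annual sinkhole risk: ~1 in {1 / p_sinkhole:,.0f}")
print(f"Annual traffic risk:  ~1 in {1 / p_traffic:,.0f}")
print(f"Driving is roughly {p_traffic / p_sinkhole:,.0f}x more dangerous")
```

Even with these crude assumptions, the icy Colorado roads come out tens of thousands of times more dangerous than the sinkhole — which is exactly what the availability bias hides from us.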
An ambulance, racing to the hospital, siren blaring, approaches an intersection. At the same time, from a different direction, a fire truck, racing to a fire, approaches the same intersection. From a third direction, a police car screeches toward the same intersection, responding to a burglary-in-progress call. From a fourth direction, a U.S. Mail truck trundles along to the same intersection. All four vehicles arrive at the same time at the same intersection controlled by a four-way stop sign. Who has the right of way?
The way I just told this story sets a frame around it that may (or may not) guide your thinking. You can look at the story from inside the frame or outside it. If you look inside the frame, you’ll pursue the internal logic of the story. The three emergency vehicles are all racing to save people — from injury, from fire, or from burglary. Which one of those is the worst case? Which one deserves to go first? It’s a tough call.
On the other hand, you could look at the story outside the frame. Instead of pursuing the internal logic, you look at the structure of the story. Rather than getting drawn into the story, you look at it from a distance. One of the first things you’ll notice is that three of the vehicles belong to the same category — emergency vehicles in full crisis mode. The fourth vehicle is different — it’s a mail truck. Could that be a clue? Indeed it is. The “correct” answer to this somewhat apocryphal story is that the mail truck has the right of way. Why? It’s a federal government vehicle and takes precedence over the other, local government vehicles.
In How Doctors Think, Jerome Groopman describes how doctors think inside the frame. A young woman is diagnosed with anorexia and bulimia. Many years later, she’s doing poorly and losing weight steadily. Her medical file is six inches thick. Each time she visits a new doctor, the medical file precedes her. The new doctor reads it, discovers that she’s bulimic and anorexic, and treats her accordingly. Finally, a new doctor sets aside her record, pulls out a blank sheet of paper, looks at the woman, and says, “Tell me your story.” In telling her own story, the woman gives important clues that lead to a new diagnosis — she’s gluten-intolerant. The new doctor stepped outside the frame of the medical record and gained valuable insights.
According to Franco Moretti, similar frames exist in literature — they’re called books. Traditional literary analysis demands that you read books and study them very closely. Moretti, an Italian literary scholar, calls this close reading — it’s studying literature inside the frame set by the book. Moretti advocates a different approach that he calls distant reading: “understanding literature not by studying particular texts, but by aggregating and analyzing massive amounts of data.” Only by stepping back and reading outside the frame can we understand “…the true scope and nature of literature.”
In each of these examples we have a frame. In the first story, I set the frame for you. It’s a riddle and I was trying to trick you. In the second story, the patient’s medical record creates the frame. In the third, the book sets the frame. In each case, we can enter the frame and study the problem closely or we can step back and observe the structure of the problem. It’s often a good idea to step inside the frame — after all, you usually do want your doctor to read your medical file. But it’s also useful to step outside the frame, where you can find clues that you would never find by studying the internal logic of the problem. In fact, I think this approach can help us understand “big” predictions like the cost of healthcare. More on that next Monday.
There’s a widespread meme in American culture that guys are not good at making commitments. While that may be true in romance, it seems that the opposite — premature commitment — is a much bigger problem in business.
That’s the opinion I’ve formed from reading Paul Nutt’s book, Why Decisions Fail. Nutt analyzes 15 debacles, which he defines as “… costly decisions that went very wrong and became public…” Indeed, some of the debacles — the Ford Pinto, Nestlé infant formula, Shell Oil’s Brent Spar disposal — not only went public but created firestorms of indignation.
While each debacle had its own special set of circumstances, each also had one common feature: premature commitment. Decision makers were looking for ideas, found one that seemed to work, latched on to it, pursued it, and ignored other equally valid alternatives. Nutt doesn’t use the terminology, but in critical thinking circles this is known as satisficing or temporizing.
Here are two examples from Nutt’s book:
Ohio State University and Big Science — OSU wanted to improve its offerings (and its reputation) in Big Science. At the same time, professors in the university’s astronomy department were campaigning for a new observatory. The university’s administrators latched on to the observatory idea and pursued it, to the exclusion of other ideas. It turns out that Ohio is not a great place to build an observatory. On the other hand, Arizona is. As the idea developed, it became an OSU project to build an observatory in Arizona. Not surprisingly, the Ohio legislature asked why Ohio taxes were being spent in Arizona. It went downhill from there.
EuroDisney — Disney had opened very successful theme parks in Anaheim, Orlando, and Tokyo and sought to replicate their success in Europe. Though they considered some 200 sites, they quickly narrowed the list to the Paris area. Disney executives let it be known that it had always been “Walt’s dream” to build near Paris. Disney pursued the dream rather than closely studying the local situation. For instance, they had generated hotel revenues in their other parks. Why wouldn’t they do the same in Paris? Well… because Paris already had lots of hotel rooms and an excellent public transportation system. So, visitors saw EuroDisney as a day trip, not an overnight destination. Disney officials might have taken fair warning from an early press conference in Paris featuring their CEO, Michael Eisner. He was pelted with French cheese.
In both these cases — and all the other cases cited by Nutt — executives rushed to judgment. As Nutt points out, they then compounded their error by misusing their resources. Instead of using resources to identify and evaluate alternatives, they invested their money and energy in studies designed to justify the alternative they had already selected.
So, what to do? When approaching a major decision, don’t satisfice. Don’t take the first idea that comes along, no matter how attractive it is. Rather, take a step back. (I often do this literally — it does affect your thinking.) Ask the bigger question — What’s the best way to improve Big Science at OSU? — rather than the narrower question — What’s the best way to build a telescope?
Here’s a cute little joke:
The receptionist at the doctor’s office goes running down the hallway and says, “Doctor, Doctor, there’s an invisible man in the waiting room.” The Doctor considers this information for a moment, pauses, and then says, “Tell him I can’t see him.”
It’s a cute play on a situation we’ve all faced at one time or another. We need to see a doctor but we don’t have an appointment and the doc just has no time to see us. We know how it feels. That’s part of the reason the joke is funny.
Now let’s talk about the movie playing in your head. Whenever we hear or read a story, we create a little movie in our heads to illustrate it. This is one of the reasons I like to read novels — I get to invent the pictures. I “know” what the scene should look like. When I read a line of dialogue, I imagine how the character would “sell” the line. The novel’s descriptions stimulate my internal movie-making machinery. (I often wonder what the interior movies of movie directors look like. Do Tim Burton’s internal movies look like his external movies? Wow.)
We create our internal movies without much thought. They’re good examples of our System 1 at work. The pictures arise based on our experiences and habits. We don’t inspect them for accuracy — that would be a System 2 task. (For more on our two thinking systems, click here). Though we don’t think much about the pictures, we may take action on them. If our pictures are inaccurate, our decisions are likely to be erroneous. Our internal movies could get us in trouble.
Consider the joke … and be honest. In the movie in your head, did you see the receptionist as a woman and the doctor as a man? Now go back and re-read the joke. I was careful not to give any gender clues. If you saw the receptionist as a woman and the doctor as a man (or vice versa), it’s because of what you believe, not because of what I said. You’re reading into the situation, and your interpretation may just be erroneous. Yet again, your System 1 is leading you astray.
What does this have to do with business? I’m convinced that many of our disagreements and misunderstandings in the business world stem from our pictures. Your pictures are different from mine. Diversity in an organization promotes innovation. But it also promotes what we might call “differential picture syndrome”.
So what to do? Simple. Ask people about the pictures in their heads. When you hear the term strategic reorganization, what pictures do you see in your mind’s eye? When you hear team-building exercise, what movie plays in your head? It’s a fairly simple and effective way to understand our conceptual differences and find common definitions for the terms we use. It’s simple. Just go to the movies together.