
Can You Frame Yourself?

No, please. Not again!

My mother was a great lady but not a great cook. TV dinners were a popular option at our house when I was a kid. If we weren’t eating TV dinners we might have to eat … frozen fish sticks. I can still smell the oily odor of limp fish sticks frying up in the little Sunbeam electric skillet. It permeated everything. I grew up in a clean-your-plate family so, ultimately, I had to choke down those mysterious fish parts. Then, without fail, I raced to the bathroom and threw up.

How does one think critically about such a situation? In our family, we quickly ruled out several non-causes. Everyone else in the family ate fish sticks and didn’t get sick. Therefore, it couldn’t be the fish sticks. Every serving of fish sticks made me sick, so we couldn’t blame it on just one box that had gone bad. Clearly, I must be allergic to fish.

So, from the age of about six to 23, I ate no fish at all. No trout or tuna or herring or salmon or swordfish. After college, I moved to Ecuador and, from time to time, took vacations to the beach. On one such vacation, I found that there was nothing to eat locally but fish. Finally, I sat in a restaurant, braced myself for the worst, and took a bite of fish. I thought, “Wow, this is really good!”  I wasn’t allergic to fish at all … just greasy, stinky frozen fish sticks.

I had been framing myself. I made an assumption about myself based on faulty evidence and stuck with it for almost 17 years. I never thought to re-check the original assumption or re-validate the evidence. I never tried to think outside the frame. Over the past several weeks, I’ve written about the issues of what might be called “external framing” in this blog. Here are some examples:

  • Police pick up a suspect in a crime. They’re pretty sure he did it so they’re especially attuned to any evidence that incriminates him. At the same time, they ignore evidence that might point to someone else. The suspect is framed even without malicious intent.
  • A doctor reads your medical records and discovers that you suffer from XYZ. So, he treats you for XYZ without listening to your current complaints. The medical records framed you and may have prevented the doctor from seeing the whole picture.

In these cases, one person is framing another. The same thing can happen to abstract issues. I’ve noticed that Republicans and Democrats frame the same issue in very different ways.

What we forget sometimes is that we can also frame ourselves. I used to teach a course in research methods that included a mild dose of inferential statistics. Many of my students were women in their 30s and 40s who were returning to school to re-start their careers. Many of them were very nervous about the statistics. They believed they weren’t good at math. As it turned out, they did just fine. They weren’t bad at math; they had just framed themselves into believing they were bad at math.

If you believe something about yourself, you might just want to poke at it a bit. Sometimes you may just be wrong. On other occasions, you may be right — I’m still terrible at anything having to do with music. Still, it’s worth asking the question. Otherwise, you may miss out on some very tasty fish.

Pascal’s Wager and the Mediterranean Diet

Wanna bet?

I like to think of Blaise Pascal (1623–1662), the French mathematician, as the western world’s first practitioner of Twitter. His Pensées were brief, enigmatic thoughts on mathematics, religion, and philosophy. Collected after his death, they read like the tweets of the 17th century (though they were intended to be a much more comprehensive defense of religion).

In the Pensées, Pascal made his famous wager. We all bet with our lives on whether God exists or not. We can live as if God exists and practice the traditional forms and virtues of religion. Or we can do the opposite and ignore our religious duties, assuming that God does not exist. If we live as if God exists and we’re right then the rewards are infinite. If we’re wrong, the loss is finite — indeed it’s quite small. Thus, Pascal argues, it’s only rational to live a pious life. The wager is heavily stacked to that side.
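
Pascal’s logic is easy to check with arithmetic. Here is a minimal sketch of the wager as an expected-value comparison, in Python; the probability and the finite payoffs are placeholders of my own choosing, since Pascal’s point is that they don’t matter so long as the probability is above zero:

    # Pascal's wager as expected value. The payoff numbers are illustrative
    # stand-ins; only the infinities and the nonzero probability matter.
    import math

    def expected_value(p, payoff_if_god_exists, payoff_if_not):
        return p * payoff_if_god_exists + (1 - p) * payoff_if_not

    p = 0.001  # grant even a tiny chance that God exists

    pious = expected_value(p, math.inf, -10)    # infinite reward vs. a small finite cost
    impious = expected_value(p, -math.inf, 10)  # infinite loss vs. a small finite gain

    print(pious, impious)  # inf -inf: for any p > 0, the pious life dominates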

In today’s world, we don’t spend much time wagering on God’s existence (perhaps we should) but we make lots of bets that are much like Pascal’s. The cumulative effects are enormous.

For instance, consider the Mediterranean diet. The diet — which features olive oil, nuts, vegetables, and fish but not much red meat — has been on our radar for a number of years now. Epidemiologists observed that people who live near the Mediterranean have a much lower rate of heart disease than would be expected. Maybe it’s the diet. Or maybe it’s something else, like religion, culture, family structure, heredity, etc.

So the evidence for the positive health effects of the diet was an observed correlation. We could see that the diet was correlated (inversely) with heart disease but we couldn’t be sure that it caused the lower rates. Maybe a hidden, third factor was in play. Still, we could make a version of Pascal’s wager: eat as if the Mediterranean diet does reduce the risk of heart disease. If we’re right, we live longer. If we’re wrong … well, we’ve missed out on a few tasty bacon cheeseburgers. Would you take the bet?

Last week, the level of evidence changed dramatically. A five-year, randomized Spanish study of nearly 7,500 people was published. People who followed the Mediterranean diet had 30% fewer heart attacks, strokes, and deaths from heart disease than the control group. Since the study used the experimental method, we can now talk about cause and effect, not just correlation. Now will you take the bet?
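
A 30% reduction is a relative figure; what it means for any one of us depends on the baseline risk. Here is the arithmetic in a quick sketch, using a made-up baseline rate rather than the study’s actual figures:

    # Relative vs. absolute risk. The baseline rate is an illustrative
    # assumption, NOT the Spanish study's actual event rate.
    control_risk = 0.040       # assume 4.0% of the control group had an event
    relative_reduction = 0.30  # the reported 30% reduction

    diet_risk = control_risk * (1 - relative_reduction)  # 2.8%
    absolute_reduction = control_risk - diet_risk        # 1.2 percentage points
    nnt = 1 / absolute_reduction                         # diners per event avoided

    print(f"diet risk: {diet_risk:.1%}, absolute drop: {absolute_reduction:.1%}, NNT: {nnt:.0f}")

Even under these assumptions the bet still looks good; the stakes are just smaller than the headline number suggests.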

Of course, there’s still some doubt about the results. It’s only one study; it hasn’t been replicated yet. It was conducted in Spain. Maybe we wouldn’t get the same results in America. Maybe there were methodological or measurement errors. Still, the evidence seems pretty strong and it points toward a version of Pascal’s classic wager.

We all make versions of Pascal’s wager every day but we rarely think about them. Perhaps it’s time that we pay more attention. Perhaps it’s time to think about what levels of evidence we need before we take the bet. Is correlation enough or do we need to prove cause and effect? Life is uncertain but perhaps we can make it more comfortable by thinking — and betting — logically. While you’re pondering that, I’m going to drizzle some olive oil over a bowl of walnuts. Drop by if you’re hungry.

Critical Thinking. What’s That?

No, thanks. I don’t want my doc to mis-diagnose me.

I visited my doctor the other day. Expecting a bit of a wait, I took along Critical Thinking, a textbook for one of my courses. When I walked into the doctor’s office, he read the title and said, “Hmmm … critical thinking. What’s that?” I thought, “My doc just asked me what critical thinking is. This can’t be a good sign.”

I quelled my qualms, however, and explained what I teach in critical thinking class. He brightened up immediately and said, “Oh, that’s just like How Doctors Think.” I pulled out my Kindle and downloaded the book immediately. Understanding how doctors think might actually help me get better medical care.

So how do they think? Well, first they use shortcuts. They generally have way too much information to deal with and use rules of thumb called heuristics. Sound familiar? I’ve written several articles about rules of thumb and how they can lead us astray. (Just look for “thumb” in this website’s Blog Search box). So, the first answer is that doctors think just like us. Is that a good thing? Here are some errors that doctors commonly make:

Representation error — the patient is a picture of health. It’s not likely that those chest pains are a cause for concern. With this error, the doctor identifies a prototype that represents a cluster of characteristics. If you fit the prototype, fine. If not, the doctor may be diagnosing the prototype rather than you.

Attribution error — this often happens with negative stereotypes. The patient is disheveled and smells of booze. Therefore, the tremors are likely caused by alcohol rather than a hereditary disease that causes copper accumulation in the liver. That may be right most of the time but when it’s wrong, it’s really wrong.

Framing errors — I’ve read the patient’s medical charts and I see that she suffers from XYZ. Therefore, we’ll treat her for XYZ. The medical record forms a frame around the patient. Sometimes, doctors forget to step outside the frame and ask about other conditions that might have popped up. Sometimes the best approach is simply to say, “Let me tell you my story.”

Confirmation bias — we see things that confirm our beliefs and don’t see (or ignore) things that don’t. We all do it.

Availability bias — if you’re the 7th patient I’ve seen today and the first six all had the flu, there’s a good chance that I’ll diagnose you with flu, too. It just comes to mind easily; it’s readily available. (A toy calculation after this list shows how that mental shortcut can swamp the real base rate.)

Affective bias — the doctor’s emotions get in the way. Sometimes these are negative emotions. (Tip: if you think your doctor feels negatively about you, get a new doctor). But positive emotions can also be harmful. I like you and I don’t want to cause you pain. Therefore, I won’t order that painful, embarrassing test — the one that might just save your life.

Sickest patient syndrome — doctors like to succeed just like anyone else does. With very sick patients, they may subconsciously conclude that they can’t be successful … and do less than their best.
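
To see why the availability shortcut misleads, compare the doctor’s recency-driven hunch with a base-rate calculation. A minimal sketch in Python; every probability here is invented for illustration and is not drawn from Groopman’s book:

    # Toy model of availability bias. The doctor's "prior" drifts toward
    # whatever walked through the door recently instead of the population
    # base rate. All probabilities are invented for illustration.

    def posterior_flu(prior_flu, p_sym_flu=0.9, p_sym_other=0.3):
        """P(flu | symptoms) via Bayes' rule."""
        p_sym = prior_flu * p_sym_flu + (1 - prior_flu) * p_sym_other
        return prior_flu * p_sym_flu / p_sym

    base_rate = 0.10   # suppose 10% of patients in town actually have the flu
    available = 0.80   # after six flu cases in a row, flu feels near-certain

    print(f"{posterior_flu(base_rate):.2f}")  # 0.25: symptoms alone are weak evidence
    print(f"{posterior_flu(available):.2f}")  # 0.92: the recency-driven prior forces the call

Same symptoms, very different diagnoses; the only thing that changed is which prior the doctor reached for.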

The list goes on … but my space doesn’t. When I started the book, I thought it was probably written for doctors. But the author, Jerome Groopman, says it’s really for us laypeople. By understanding how doctors think, we can communicate more effectively with our physicians and help them avoid mistakes. It’s a good thought and a fun read.

Thinking Outside the Frame

Make way!

An ambulance, racing to the hospital, siren blaring, approaches an intersection. At the same time, from a different direction, a fire truck, racing to a fire, approaches the same intersection. From a third direction, a police car screeches toward the same intersection, responding to a burglary-in-progress call. From a fourth direction, a U.S. Mail truck trundles along to the same intersection. All four vehicles arrive at the same time at the same intersection controlled by a four-way stop sign. Who has the right of way?

The way I just told this story sets a frame around it that may (or may not) guide your thinking. You can look at the story from inside the frame or outside it. If you look inside the frame, you’ll pursue the internal logic of the story. The three emergency vehicles are all racing to save people — from injury, from fire, or from burglary. Which one of those is the worst case? Which one deserves to go first? It’s a tough call.

On the other hand, you could look at the story outside the frame. Instead of pursuing the internal logic, you look at the structure of the story. Rather than getting drawn into the story, you look at it from a distance. One of the first things you’ll notice is that three of the vehicles belong to the same category — emergency vehicles in full crisis mode. The fourth vehicle is different — it’s a mail truck. Could that be a clue? Indeed it is. The “correct” answer to this somewhat apocryphal story is that the mail truck has the right of way. Why? It’s a federal government vehicle and takes precedence over the other, local government vehicles.

In How Doctors Think, Jerome Groopman describes how doctors think inside the frame. A young woman is diagnosed with anorexia and bulimia. Many years later, she’s doing poorly and losing weight steadily. Her medical file is six inches thick. Each time she visits a new doctor, the medical file precedes her. The new doctor reads it, discovers that she’s bulimic and anorexic and treats her accordingly. Finally, a new doctor sets aside her record, pulls out a blank sheet of paper, looks at the woman and says, “Tell me your story.” In telling her own story, the woman gives important clues that lead to a new diagnosis — she’s gluten-intolerant. The new doctor stepped outside the frame of the medical record and gained valuable insights.

According to Franco Moretti, similar frames exist in literature — they’re called books. Traditional literary analysis demands that you read books and study them very closely. Moretti, an Italian literary scholar, calls this close reading — it’s studying literature inside the frame set by the book. Moretti advocates a different approach that he calls distant reading: “understanding literature not by studying particular texts, but by aggregating and analyzing massive amounts of data.” Only by stepping back and reading outside the frame can we understand “the true scope and nature of literature.”
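
In computational terms, distant reading is corpus-level aggregation. A minimal sketch of the idea, assuming a hypothetical folder of plain-text novels; word frequency is only a crude stand-in for Moretti’s actual measures:

    # "Distant reading" in miniature: aggregate statistics across many books
    # rather than close-reading one. The corpus/ folder is hypothetical.
    from collections import Counter
    from pathlib import Path
    import re

    corpus_counts = Counter()
    for book in Path("corpus").glob("*.txt"):  # hypothetical folder of novels
        words = re.findall(r"[a-z']+", book.read_text(encoding="utf-8").lower())
        corpus_counts.update(words)

    # Patterns that emerge only in aggregate, never inside a single book:
    print(corpus_counts.most_common(20))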

In each of these examples we have a frame. In the first story, I set the frame for you. It’s a riddle and I was trying to trick you. In the second story, the patient’s medical record creates the frame. In the third, the book sets the frame. In each case, we can enter the frame and study the problem closely or we can step back and observe the structure of the problem. It’s often a good idea to step inside the frame — after all, you usually do want your doctor to read your medical file. But it’s also useful to step outside the frame, where you can find clues that you would never find by studying the internal logic of the problem. In fact, I think this approach can help us understand “big” predictions like the cost of healthcare. More on that next Monday.

Premature Commitment. It’s a Guy Thing.

Sin in haste. Repent at leisure.

There’s a widespread meme in American culture that guys are not good at making commitments. While that may be true in romance, it seems that the opposite — premature commitment — is a much bigger problem in business.

That’s the opinion I’ve formed from reading Paul Nutt’s book, Why Decisions Fail. Nutt analyzes 15 debacles, which he defines as “… costly decisions that went very wrong and became public …” Indeed, some of the debacles — the Ford Pinto, Nestlé infant formula, Shell Oil’s Brent Spar disposal — not only went public but created firestorms of indignation.

While each debacle had its own special set of circumstances, each also had one common feature: premature commitment. Decision makers were looking for ideas, found one that seemed to work, latched on to it, pursued it, and ignored other equally valid alternatives. Nutt doesn’t use the terminology, but in critical thinking circles this is known as satisficing or temporizing.

Here are two examples from Nutt’s book:

Ohio State University and Big Science — OSU wanted to improve its offerings (and its reputation) in Big Science. At the same time, professors in the university’s astronomy department were campaigning for a new observatory. The university’s administrators latched on to the observatory idea and pursued it, to the exclusion of other ideas. It turns out that Ohio is not a great place to build an observatory. On the other hand, Arizona is. As the idea developed, it became an OSU project to build an observatory in Arizona. Not surprisingly, the Ohio legislature asked why Ohio taxes were being spent in Arizona. It went downhill from there.

EuroDisney — Disney had opened very successful theme parks in Anaheim, Orlando, and Tokyo and sought to replicate their success in Europe. Though they considered some 200 sites, they quickly narrowed the list to the Paris area. Disney executives let it be known that it had always been “Walt’s dream” to build near Paris. Disney pursued the dream rather than closely studying the local situation. For instance, they had generated hotel revenues in their other parks. Why wouldn’t they do the same in Paris? Well… because Paris already had lots of hotel rooms and an excellent public transportation system. So, visitors saw EuroDisney as a day trip, not an overnight destination. Disney officials might have taken fair warning from an early press conference in Paris featuring their CEO, Michael Eisner. He was pelted with French cheese.

In both these cases — and all the other cases cited by Nutt — executives rushed to judgment. As Nutt points out, they then compounded their error by misusing their resources. Instead of using resources to identify and evaluate alternatives, they invested their money and energy in studies designed to justify the alternative they had already selected.

So, what to do? When approaching a major decision, don’t satisfice. Don’t take the first idea that comes along, no matter how attractive it is. Rather, take a step back. (I often do this literally — it does affect your thinking). Ask the bigger question — What’s the best way to improve Big Science at OSU? — rather than the narrower question — What’s the best way to build a telescope?
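
The cost of satisficing is easy to simulate: commit to the first idea that clears a bar, or score the whole field and take the best. A toy sketch; the threshold and the random scores are invented for illustration:

    # Toy simulation of satisficing vs. evaluating all alternatives.
    # All numbers are invented for illustration.
    import random

    random.seed(42)

    def satisfice(options, good_enough=0.7):
        """Commit to the first option that clears the bar."""
        for value in options:
            if value >= good_enough:
                return value
        return options[-1]  # nothing cleared the bar; settle for the last one

    trials = 10_000
    gap = 0.0
    for _ in range(trials):
        options = [random.random() for _ in range(10)]  # ten candidate ideas, scored 0 to 1
        gap += max(options) - satisfice(options)

    print(f"average value left on the table: {gap / trials:.2f}")

The simulation is crude, but it makes Nutt’s point: the first acceptable idea is rarely the best one, and the gap is the price of premature commitment.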
