
Culture — Would You Laugh At Your Boss?

We must be in a high PDI zone.

Would you make fun of your boss… to her face?  If you’re from Denmark, you might. If you’re from Slovakia, probably not.

That’s one of the conclusions you might draw from the Power Distance Index (PDI), a measure of one of the “dimensions” of culture. Since the early 20th century, social scientists have worked to classify human cultures and measure how they differ from each other. The dimensions deal with fundamental concepts, like how we conceive of ourselves as individuals, the relationship between the individual and the group, how men and women relate to each other, and how we handle conflict. From these foundations, different observers have developed different numbers of cultural dimensions, ranging from a low of four to a high of nine.

Lately, I’ve been reading the work of Geert and Gert Jan Hofstede, a father-son team of cultural researchers from the Netherlands. In their book, Cultures and Organizations: Software of the Mind, they suggest that there are five cultural dimensions: 1) power distance; 2) individualism versus collectivism; 3) femininity versus masculinity; 4) uncertainty avoidance; 5) long-term versus short-term orientation. I’d like to look at all five of these — and their interrelationships — over the coming weeks. Today, let’s look at power distance. (By the way, one of the reasons I like the Hofstedes is that they relate their findings to the workplace. The last several chapters of their book offer practical advice on managing in a multicultural world.)

The Hofstedes define power distance as “the extent to which the less powerful [citizens] of a country expect and accept that power is distributed unequally.” They developed an index to measure power distance within a country and applied it to 74 countries. The five countries with the highest PDI scores are Malaysia (PDI = 104), Slovakia (104), Guatemala (95), Panama (95), and the Philippines (94). The five with the lowest scores are German-speaking Switzerland (26), New Zealand (22), Denmark (18), Israel (13), and Austria (11). (The United States has a PDI of 40.)

Power distance affects cultures in myriad ways. Countries with low PDIs generally believe that “inequalities among people should be minimized.” Those with high PDIs, on the other hand, generally believe that “inequalities among people are expected and desired.” In low-PDI countries, “parents treat children as equals”; in high-PDI countries, “parents teach children obedience.”

In the workplace, differences between low- and high-PDI countries can be pronounced. In low-PDI countries, the workplace is generally decentralized with few supervisors. The ideal boss is a resourceful democrat and status symbols are generally frowned upon. In high-PDI countries, the situation is essentially reversed: the workplace is centralized with a large number of supervisors. The ideal boss is a benevolent autocrat and status symbols are normal and popular.

Here are two findings that struck me as very odd. The first: in Europe and the Americas, countries with Romance languages had — on average — higher PDI scores than those with Germanic languages. In the popular imagination, Germanic countries are often perceived to be very hierarchical. The Hofstedes’ research suggests that the opposite is true. Austria (PDI = 11) has the lowest PDI of any of the 74 countries. German-speaking Switzerland has a PDI of 26, which compares with 70 for French-speaking Switzerland. Germany itself has a score of 35.

The other oddity is that PDI scores tend to drop the farther north (or the farther south) you move from the equator. Tropical lands tend to have higher power distances than temperate lands. Here are scores for some of the more northern (and southern) countries: Denmark (18), New Zealand (22), Ireland (28), Sweden (31), Norway (31), Finland (33), Australia (36), Canada (39). The Hofstedes suggest some possible reasons for this distribution, but their ideas seem a bit too pat to me. While I’m scratching my head as to the cause, it may be that we’re simply measuring language differences again. None of the countries I’ve just listed speaks a Romance language.

So, what does it all mean? I’m still working on that. I do find it very interesting that the Nordic countries are clustered at the very low end of the PDI table while they’re at the very high end of the World Happiness Report and the Global Innovation Index. There’s something interesting in the state of Denmark. Think about that for a while. In the meantime, don’t laugh at your boss.

Critical Thinking. What’s That?

No, thanks. I don’t want my doc to mis-diagnose me.

I visited my doctor the other day. Expecting a bit of a wait, I took along Critical Thinking, a textbook for one of my courses. When I walked into the doctor’s office, he read the title and said, “Hmmm … critical thinking. What’s that?” I thought, “My doc just asked me what critical thinking is. This can’t be a good sign.”

I quelled my qualms, however, and explained what I teach in critical thinking class. He brightened up immediately and said, “Oh, that’s just like How Doctors Think.” I pulled out my Kindle and downloaded the book immediately. Understanding how doctors think might actually help me get better medical care.

So how do they think? Well, first they use shortcuts. They generally have way too much information to deal with and use rules of thumb called heuristics. Sound familiar? I’ve written several articles about rules of thumb and how they can lead us astray. (Just look for “thumb” in this website’s Blog Search box). So, the first answer is that doctors think just like us. Is that a good thing? Here are some errors that doctors commonly make:

Representation error — the patient is a picture of health. It’s not likely that those chest pains are a cause for concern. With this error, the doctor identifies a prototype that represents a cluster of characteristics. If you fit the prototype, fine. If not, the doctor may be diagnosing the prototype rather than you.

Attribution error — this often happens with negative stereotypes. The patient is disheveled and smells of booze. Therefore, the tremors are likely caused by alcohol rather than a hereditary disease that causes copper accumulation in the liver. That may be right most of the time but when it’s wrong, it’s really wrong.

Framing error — I’ve read the patient’s medical charts and I see that she suffers from XYZ. Therefore, we’ll treat her for XYZ. The medical record forms a frame around the patient. Sometimes, doctors forget to step outside the frame and ask about other conditions that might have popped up. Sometimes the best approach is simply to say, “Let me tell you my story.”

Confirmation bias — we see things that confirm our beliefs and don’t see (or ignore) things that don’t. We all do it.

Availability bias — if you’re the 7th patient I’ve seen today and the first six all had the flu, there’s a good chance that I’ll diagnose you with flu, too. It just comes to mind easily; it’s readily available.

Affective bias — the doctor’s emotions get in the way. Sometimes these are negative emotions. (Tip: if you think your doctor feels negatively about you, get a new doctor). But positive emotions can also be harmful. I like you and I don’t want to cause you pain. Therefore, I won’t order that painful, embarrassing test — the one that might just save your life.

Sickest patient syndrome — doctors like to succeed just like anyone else does. With very sick patients, they may subconsciously conclude that they can’t be successful … and do less than their best.

The list goes on … but my space doesn’t. When I started the book, I thought it was probably written for doctors. But the author, Jerome Groopman, says it’s really for us laypeople. By understanding how doctors think, we can communicate more effectively with our physicians and help them avoid mistakes. It’s a good thought and a fun read.

The Structure of Predictions

The cost of my services may go down.

I like to think about the future. So, in the past, I’ve written about scenario planning, prediction markets, resilience, and expert predictors. What have I learned in all this? Mainly, that experts regularly get it wrong. Also, that experts move in herds — one expert influences another and they begin to mutually reinforce each other. In the worst cases, we get manias, whether it’s tulip mania in 17th century Holland or mortgage mania in 21st century America. Paying ten times your annual income for a tulip bulb in 1637 is really not that different from Bank of America paying $4 billion for Countrywide.

I’ve also learned that you can (sometimes) make a lot of money by betting against the experts. The clearest description of “shorting” the experts is probably The Big Short by Michael Lewis.

I’m also forming the opinion that we call people “experts” because they study problems closely. They’re analysts; they study the details. Like college professors, they know a lot about a little. That may make them interesting dinner partners (or not), but does it make them better predictors of the future?

I’m thinking that the experts’ “close read” makes them worse predictors of the future, not better. Why? Because they go inside the frame of the problems. They pursue the internal logic of the story. Studying the internal logic of a situation can be useful but, as I pointed out in a recent article, it can also lead you astray. In addition to the internal logic, you need to step outside the frame and study the structure of the problem. If you stay inside the frame, you may well understand the internal dynamics of the issue. But, in many cases, the external dynamics are more important.

The case that I’ve been following is the cost of healthcare in the United States. The experts all seem to be pointing in the same direction: healthcare costs will continue to skyrocket and ultimately bankrupt the country. The experts are pointing in one direction so, as in the past, I think it’s useful to look in the other direction and predict that healthcare costs won’t climb as rapidly as in the past or may even go down.

Here are two interesting pieces of evidence that suggest that the experts may be wrong. The first is a report from the Altarum Institute which notes that 2012 represented the “…fourth consecutive year of record-low growth [in healthcare spending] compared to all previous years in the 50-plus years of official health spending data.” Granted, there’s a debate as to whether the slowing growth is caused by the recession or by structural changes but the experts (yikes!) suggest that at least some of the shift is structural.

The second piece of evidence is a report by Matthew Yglesias in Slate that documents the dramatic decline in spending for healthcare construction. Spending to construct new hospitals dropped precipitously in 2008 and has stayed low, even during the recovery. As Yglesias points out, construction spending is “the closest thing we have to a real-time forecast of what the future is going to look like.”

So, are the experts wrong? As Zhou Enlai liked to say, it’s too soon to tell. But let’s keep an eye on them. Otherwise, we could be framed.

 

Thinking Outside the Frame

Make way!

An ambulance, racing to the hospital, siren blaring, approaches an intersection. At the same time, from a different direction, a fire truck, racing to a fire, approaches the same intersection. From a third direction, a police car screeches toward the same intersection, responding to a burglary-in-progress call. From a fourth direction, a U.S. Mail truck trundles toward the same intersection. All four vehicles arrive at the same moment at the intersection, which is controlled by a four-way stop sign. Who has the right of way?

The way I just told this story sets a frame around it that may (or may not) guide your thinking. You can look at the story from inside the frame or outside it. If you look inside the frame, you’ll pursue the internal logic of the story. The three emergency vehicles are all racing to save people — from injury, from fire, or from burglary. Which one of those is the worst case? Which one deserves to go first? It’s a tough call.

On the other hand, you could look at the story from outside the frame. Instead of pursuing the internal logic, you look at the structure of the story. Rather than getting drawn into the story, you look at it from a distance. One of the first things you’ll notice is that three of the vehicles belong to the same category — emergency vehicles in full crisis mode. The fourth vehicle is different — it’s a mail truck. Could that be a clue? Indeed it is. The “correct” answer to this somewhat apocryphal story is that the mail truck has the right of way. Why? It’s a federal government vehicle and takes precedence over the other vehicles, which belong to local governments.

In How Doctors Think, Jerome Groopman describes how doctors think inside the frame. A young woman is diagnosed with anorexia and bulimia. Many years later, she’s doing poorly and losing weight steadily. Her medical file is six inches thick. Each time she visits a new doctor, the medical file precedes her. The new doctor reads it, discovers that she’s bulimic and anorexic, and treats her accordingly. Finally, a new doctor sets aside her record, pulls out a blank sheet of paper, looks at the woman and says, “Tell me your story.” In telling her own story, the woman gives important clues that lead to a new diagnosis — she’s gluten-intolerant. The new doctor stepped outside the frame of the medical record and gained valuable insights.

According to Franco Moretti, similar frames exist in literature — they’re called books. Traditional literary analysis demands that you read books and study them very closely. Moretti, an Italian literary scholar, calls this close reading — it’s studying literature inside the frame set by the book. Moretti advocates a different approach that he calls distant reading: “understanding literature not by studying particular texts, but by aggregating and analyzing massive amounts of data.” Only by stepping back and reading outside the frame can we understand “…the true scope and nature of literature.”

In each of these examples we have a frame. In the first story, I set the frame for you. It’s a riddle and I was trying to trick you. In the second story, the patient’s medical record creates the frame. In the third, the book sets the frame. In each case, we can enter the frame and study the problem closely or we can step back and observe the structure of the problem. It’s often a good idea to step inside the frame — after all, you usually do want your doctor to read your medical file. But it’s also useful to step outside the frame, where you can find clues that you would never find by studying the internal logic of the problem. In fact, I think this approach can help us understand “big” predictions like the cost of healthcare. More on that next Monday.

 

 

Premature Commitment. It’s a Guy Thing.

Sin in haste. Repent at leisure.

There’s a widespread meme in American culture that guys are not good at making commitments. While that may be true in romance, it seems that the opposite — premature commitment — is a much bigger problem in business.

That’s the opinion I’ve formed from reading Paul Nutt’s book, Why Decisions Fail. Nutt analyzes 15 debacles, which he defines as “… costly decisions that went very wrong and became public…” Indeed, some of the debacles — the Ford Pinto, Nestlé infant formula, Shell Oil’s Brent Spar disposal — not only went public but created firestorms of indignation.

While each debacle had its own special set of circumstances, each also had one common feature: premature commitment. Decision makers were looking for ideas, found one that seemed to work, latched on to it, pursued it, and ignored other equally valid alternatives. Nutt doesn’t use the terminology, but in critical thinking circles this is known as satisficing or temporizing.

Here are two examples from Nutt’s book:

Ohio State University and Big Science — OSU wanted to improve its offerings (and its reputation) in Big Science. At the same time, professors in the university’s astronomy department were campaigning for a new observatory. The university’s administrators latched on to the observatory idea and pursued it, to the exclusion of other ideas. It turns out that Ohio is not a great place to build an observatory. On the other hand, Arizona is. As the idea developed, it became an OSU project to build an observatory in Arizona. Not surprisingly, the Ohio legislature asked why Ohio taxes were being spent in Arizona. It went downhill from there.

EuroDisney — Disney had opened very successful theme parks in Anaheim, Orlando, and Tokyo and sought to replicate their success in Europe. Though they considered some 200 sites, they quickly narrowed the list to the Paris area. Disney executives let it be known that it had always been “Walt’s dream” to build near Paris. Disney pursued the dream rather than closely studying the local situation. For instance, they had generated hotel revenues in their other parks. Why wouldn’t they do the same in Paris? Well… because Paris already had lots of hotel rooms and an excellent public transportation system. So, visitors saw EuroDisney as a day trip, not an overnight destination. Disney officials might have taken fair warning from an early press conference in Paris featuring their CEO, Michael Eisner. He was pelted with French cheese.

In both these cases — and all the other cases cited by Nutt — executives rushed to judgment. As Nutt points out, they then compounded their error by misusing their resources. Instead of using resources to identify and evaluate alternatives, they invested their money and energy in studies designed to justify the alternative they had already selected.

So, what to do? When approaching a major decision, don’t satisfice. Don’t take the first idea that comes along, no matter how attractive it is. Rather, take a step back. (I often do this literally — it does affect your thinking.) Ask the bigger question — What’s the best way to improve Big Science at OSU? — rather than the narrower question — What’s the best way to build a telescope?

 
