
The Last of Thumb Thinking

Sure it’s dangerous. But we’re in control. No problem!

Heuristics are simply rules of thumb. They help us make decisions quickly and, in most cases, accurately. They help guide us through the day. Most often, we’re not even aware that we’re making decisions. Unfortunately, some of those decisions can go haywire — precisely because we’re operating on automatic pilot. In fact, psychologists suggest that we commonly make 17 errors via heuristics. I’ve surveyed 11 of them in previous posts (click here, here, and here). Let’s look at the last six today.

Optimistic bias — can get us into a lot of trouble. It leads us to conclude that we have much more control over dangerous situations than we actually do. We underestimate our risks. This can help us when we’re starting a new company; if we knew the real odds, we might never try. It can hurt us, however, when we’re trying to estimate the danger of cliff diving.

Hindsight bias — can also get us into trouble. Everything we did that was successful is by dint of our own hard work, talent, and skill. Everything we did that was unsuccessful was because of bad luck or someone else’s failures. Overconfidence, anyone?

Elimination by aspect — you’re considering multiple options and drop one of them because of a single issue. It’s often called the “one-strike-and-you’re-out” heuristic. I think of it as the opposite of satisficing. With satisficing, you jump on the first solution that comes along. With elimination by aspect, you drop an option at the first sign of a problem. Either way, you’re making decisions too quickly.

Anchoring with adjustment — I call this the “first-impressions-never-die” heuristic and I worry about it when I’m grading papers. Let’s say I give Harry a C on his first paper. That becomes an anchor point. When his second paper arrives, I run the risk of simply adjusting upward or downward from the anchor point. If the second paper is outstanding, I might just conclude that it’s a fluke. But it’s equally logical to assume that the first paper was a fluke while the second paper is more typical of Harry’s work. Time to weigh anchor.

Stereotyping — we all know this one: to judge an entire group based on a single instance. I got in an accident and a student from the University of Denver went out of her way to help me out. Therefore, all University of Denver students must be altruistic and helpful. Or the flip side: I had a bad experience with Delta Airlines, therefore, I’ll never fly Delta again. It seems to me that we make more negative stereotyping mistakes than positive ones.

All or Nothing — the risk of something happening is fairly low. In fact, it’s so low that we don’t have to account for it in our planning. It’s probably not going to happen. We don’t need to prepare for that eventuality. When I cross the street, there’s a very low probability that I’ll get hit by a car. So why bother to look both ways?
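The flaw in all-or-nothing thinking is that a tiny per-event risk compounds over repeated events. A quick sketch makes the point (the numbers are hypothetical, chosen purely for illustration):

```python
# All-or-nothing thinking ignores repetition. The chance of at least one
# bad outcome in n independent tries is 1 - (1 - p) ** n, not p.
p = 1e-5      # hypothetical risk per street crossing
n = 20_000    # hypothetical number of crossings over a few decades

cumulative = 1 - (1 - p) ** n
print(f"Per-crossing risk:  {p:.5f}")        # 0.00001 -- "ignore it"
print(f"Cumulative risk:    {cumulative:.3f}")  # ~0.181 -- hard to ignore
```

A risk small enough to round to zero on any single crossing adds up to roughly one chance in five over a lifetime of crossings. That’s why you look both ways.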

As with my other posts on heuristics and systematic biases, I relied heavily on Peter Facione’s book, Think Critically, to prepare this post. You can find it here.



The Anatomy of Debacles

I was satisficing. If only I had known.

What’s a debacle? According to Paul Nutt, it’s “…merely a botched decision that attracts attention and gets a public airing.” Nutt goes on to write that his “…research shows that half of the decisions made in business and related organizations fail.” Actually, it may be higher because, “failed decisions that avoid a public airing are apt to be covered up.”

Remember that I wrote not long ago (click here) that perhaps 70% of change management efforts fail? Now we learn that half — or more — of all business decisions fail. We’re not doing so well. Nutt has studied over 400 debacles — botched decisions that became public disasters — and has created an anatomy of why and how they happen. Nutt’s book, Why Decisions Fail, is a sobering look at how we manage our organizations and, more specifically, our decisions.

Critical thinking should help us avoid botched decisions and public debacles. I’ll be writing about critical thinking over the next several months and, from time to time, will pull ideas from Nutt’s book. Today, let’s set the stage by looking at the basics. Nutt writes that blunders happen because of three broad reasons:

  1. Failure-prone practices — Nutt claims that two-thirds of business decisions are arrived at via common practices that are known to fail. One of our problems is that we typically don’t analyze the decision-making process itself. We tend to ask, “how can we correct the problem?” rather than “how did our decision-making process lead us astray?” As an example, Nutt notes that “Nearly everyone knows that participation prompts acceptance, but participation is rarely used.”
  2. Premature commitments — Nutt writes that “Decision makers often jump at the first idea that comes along and then spend years trying to make it work.” In the vocabulary of critical thinking, this is known as satisficing or temporizing — two heuristics that can lead us astray. Nutt concludes that, “A rush to judgment is seductive and deadly and can be headed off.” I’ll write more on how to head it off in future articles.
  3. Wrong-headed investments — the basic problem here is that we spend lots of time, energy, and money to demonstrate that our decision is correct and the ideas behind it are sound. We call it an evaluation but really, it’s a justification. It’s better to invest our energy in clarifying objectives, discovering issues (and opportunities), and identifying measures of risk and benefit.

Nutt also criticizes contingency theory — the idea that your situation dictates your tactics. For instance, if you’re faced with a community boycott, you should do X; if you’re faced with cost overruns, you should do Y. Nutt concludes that, “Best practices can be followed regardless of the decision to be made or the circumstances surrounding it.” The bulk of his book outlines what those best practices are.

Of course, there’s a lot more to it. I’ll outline the highlights in future posts and put Nutt’s findings in the general context of critical thinking. I hope you’ll follow along. In the meantime, don’t make any premature commitments.

Strategy: What If Health Care Costs Go Down?

The telegraph will generate millions of jobs.

In the early 1990s, call centers were popping up around the United States like mushrooms on a dewy morning. Companies invested millions of dollars to improve customer service via well-trained, professional operators in automated centers. Several prognosticators suggested that the segment was growing so quickly that every man, woman, and child in the United States would be working in a call center by, oh say, 2010.

Of course, it didn’t happen. The Internet arrived and millions of customers chose to serve themselves. Telecommunication costs plummeted and many companies moved their call centers offshore. Call centers are still important but not nearly as pervasive in the United States as they were projected to be.

Now we’re faced with similar projections for health care costs. If current trends continue, prognosticators say, health care will consume an ever-increasing portion of the American budget until everything simply falls apart. Given our experience with other “obvious trends”, I think it behooves us to ask the opposite question: what if health care costs go down?

Why would health care costs go down? Simply put — we may just cure a few diseases.

Why am I optimistic about potential cures? Because we’re making progress on many different fronts. For instance, what if obesity isn’t a social/cultural issue but a bacteriological issue? That’s the upshot of a recent article published in The ISME Journal. To quote: “Gram-negative opportunistic pathogens in the gut may be pivotal in obesity…” (For the original article, click here. For a summary in layman’s terms, click here). In other words, having the wrong bacteria in your gut could make you fat. Neutralizing those bacteria could slim down the whole country and reduce our health care costs dramatically.

And what about cancer? Apparently, we’re learning how to “persuade” cancer cells to kill themselves. I’ve spotted several articles on this — click here, here, here, here, and here for samples. Researchers hope that training cancer cells to commit suicide could cure many cancers in one fell swoop rather than trying to knock them off one at a time.

Of course, I’m not a medical doctor and it’s exceedingly hard to predict whether or when these findings might be transformed into real solutions. But I am old enough to know that “obvious predictions” often turn out to be dead wrong. In the late 1980s, experts predicted that our crime rate would spike to new highs in the 1990s. Instead, it did exactly the opposite. Similarly, we expected Japan to dominate the world economy. That didn’t happen either. We expected call centers to dominate the labor market. Instead, demand shifted to the Internet.

In the case of health care, it’s hard to make specific predictions. But a good strategist will always ask the “opposite” question. If the whole world is predicting that X will grow in significance, the strategist will always ask, “what if the reverse is true?” You may not be able to predict the future but you can certainly prepare for it.


Golf and Logic

Relax, Elliot. It’s just a birdie.

Last week, I tweeted about the ability of golfers to make putts under different conditions. I claimed that even the best golfers have a loss aversion bias. Therefore, they’re more accurate when putting for par than for birdie. If they miss a par putt, they wind up with a bogey — a painful experience. If they miss a birdie, on the other hand, they often get a par — not so bad. The pain of getting a bogey is greater than the pleasure of getting a birdie. That’s pretty much the definition of the loss aversion bias.

Several of my friends who are golfers suggested that I don’t know what I’m talking about. For instance, my buddy Nick Gomersall wrote, “When you putt for a birdie you are more relaxed, nothing to lose and you stroke it in. Now when you putt for par you have negative thoughts as you think what happens if you miss and you put a bad stroke on it.” Nick is an excellent golfer and an all-round good guy but he’s wrong.

Here’s a link to a 2009 article on the topic, “Avoiding the Agony of a ‘Bogey’: Loss Aversion in Golf — and Business”. The authors, Devin Pope and Maurice Schweitzer, gathered data on 2.5 million putts taken in 239 PGA tournaments between 2004 and 2009. They note that “par” is an excellent divider between gain and loss. Better-than-par is a gain; worse-than-par is a loss. Loss aversion theory says losses are felt more deeply than gains. Thus, a bogey should bring more pain than a birdie brings pleasure. A golfer should try harder to avoid a bogey than to achieve a birdie.

And, that’s exactly what happens. The authors conclude, “…on average, golfers make their birdie putts approximately two percentage points less often than they make comparable par putts. This finding is consistent with loss aversion; players invest more focus when putting for par to avoid encoding a loss ….” Pope and Schweitzer used a number of statistical analyses to control for variables such as distance from the hole, the player’s overall skill, confidence, nervousness, and so on. The only explanation seems to be loss aversion.

Loss aversion is essentially an illogical bias. Why does it occur? According to Schweitzer, “Loss aversion is the systematic mistake of segregating gains and losses — evaluating decisions in isolation rather than in the aggregate — and over-weighting losses relative to gains…”
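Schweitzer’s definition can be made concrete with a toy value function. The numbers below are my own illustrative assumptions, not figures from the study: losses are simply weighted twice as heavily as gains.

```python
LOSS_WEIGHT = 2.0  # assumed loss-aversion coefficient (roughly 2-to-1)

def felt_value(x):
    """Subjective value of a gain (x >= 0) or loss (x < 0) under loss aversion."""
    return x if x >= 0 else LOSS_WEIGHT * x

outcomes = [+300, -200]  # e.g. two holes in a round, or two stocks on one day

# Segregating: feel each outcome on its own, then add up the feelings.
segregated = sum(felt_value(x) for x in outcomes)  # 300 + (-400) = -100

# Aggregating: net the outcomes first, then feel the total once.
aggregated = felt_value(sum(outcomes))             # felt_value(+100) = 100

print(segregated, aggregated)  # -100 vs 100: same facts, opposite feeling
```

Same dollars either way; only the framing differs. Evaluated in isolation, the day feels like a loss. Evaluated in aggregate, it was a gain.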

Golf, it seems, can teach us a lot about business and finance. To avoid loss aversion, you need to look at the big picture. In golf, that means your position in the tournament rather than your position on the green. In investing, that may mean focusing on your entire portfolio rather than the gains or losses of an individual stock on an individual day. As Monty Python points out, “Always look on the bright side of life.”



More Thinking on Your Thumbs

Power differential.

Remember heuristics? They’re the rules of thumb that allow us to make snap judgments, using System 1, our fast, automatic, and ever-on thinking system. They can also lead us into errors. According to psychologists, there are at least 17 errors that we commonly make. In previous articles, I’ve written about seven of them (click here and here). Let’s look at four more today.

Association — word association games are a lot of fun. (Actually, words are a lot of fun). But making associations and then drawing conclusions from them can get you into trouble. You say tomato and I think of the time I ate a tomato salad and got sick. I’m not going to do that again. That’s not good hygiene or good logic. The upside is that word associations can lead you to some creative thinking. You can make connections that you might otherwise have missed. And, as we all know, connections are the foundation of innovation. Just be careful about drawing conclusions.

Power differential — did you ever work for a boss with a powerful personality? Then you know something about this heuristic. Socially and politically, it may be easier to accept an argument made by a “superior authority” than it is to oppose it. It’s natural. We tend to defer to those who have more power or prestige than we do. Indeed, there’s an upside here as well. It’s called group harmony. Sometimes you do need to accept your spouse’s preferences even if they differ from yours. The trick is to recognize when preferences are merely a matter of taste versus preferences that can have significant negative results. As Thomas Jefferson said, “On matters of style, swim with the current. On matters of principle, stand like a rock”.

Illusion of control — how much control do you really have over processes and people at your office? It’s probably a lot less than you think. I’ve worked with executives who think they’ve solved a problem just because they’ve given one good speech. A good speech can help but it’s usually just one step in a long chain of activities. Here’s a tip for spotting other people who have an illusion of control. They say “I” much more often than “we”. It’s poor communication and one of the worst mistakes you can make in a job interview. (Click here for more).

Loss and risk aversion — let’s just keep doing what we’re doing. Let’s not change things … we might be worse off. Why take risks? It happens that loss aversion has a much bigger influence on economic decisions than we once thought. In Thinking, Fast and Slow, Daniel Kahneman writes about our unbalanced logic when considering gain versus loss — we fear loss more than we’re attracted by gain. In general terms, the pain of a loss is about double the pleasure of a gain. So, emotionally, it takes a $200 gain to balance a $100 loss. Making 2-to-1 decisions may be good for your nerves but it often means that you’ll pass up good economic opportunities.
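That 2-to-1 arithmetic can be checked in a few lines. This is a deliberate simplification — a linear stand-in for Kahneman’s value function, with the 2x loss weight assumed:

```python
def felt(x, loss_weight=2.0):
    """Emotional value of a dollar outcome; losses weighted ~2x (assumed)."""
    return x if x >= 0 else loss_weight * x

# A $200 gain just balances a $100 loss emotionally:
print(felt(200) + felt(-100))  # 0

# A coin flip paying +$150 or -$100 is a good bet in dollars ...
expected_dollars = 0.5 * 150 + 0.5 * (-100)   # +25.0
# ... but feels like a bad one, so a loss-averse person passes it up:
expected_feeling = 0.5 * felt(150) + 0.5 * felt(-100)  # -25.0
print(expected_dollars, expected_feeling)
```

That last line is the passed-up opportunity in miniature: positive expected value, negative expected feeling.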

To prepare this article, I drew primarily on Peter Facione’s Think Critically. (Click here) Daniel Kahneman’s book is here.
