

False Consensus and Preference Falsification

Bye bye!

The Soviet Union collapsed on December 26, 1991. While signs of decay had been growing, the final collapse happened with unexpected speed. The union disappeared almost overnight and surprisingly few Soviet citizens bothered to defend it. Though it had seemed stable – and persistent – even a few months earlier, it evaporated with barely a whimper.

We could (and probably will) debate for years why the USSR disappeared, but I suspect that two cognitive biases — false consensus and preference falsification — were significant contributors. Simply put, many people lied. They said that they supported the regime when, in fact, they did not. And when they looked to others to gauge opinion, those people lied about their preferences, too. It seemed that people widely supported the government. They said so, didn’t they? Since a majority seemed to agree, it was reasonable to assume that the government would endure. Best to go with the flow. But when cracks in the edifice appeared, they quickly brought down the entire structure.

Why would people lie about their preferences? Partially because they believed that a consensus existed in the broader community. In such situations, one might lie because of:

  • A desire to stay in step with majority opinion — this is essentially a sociocentric bias. We enhance our self-esteem by agreeing with the majority.
  • A desire to remain politically correct – this may be fear-induced, especially in authoritarian regimes.
  • Lack of information – when information is scarce, we may assume that the majority (as we perceive it) is probably right. We should go along.

False consensus and preference falsification can lead to illogical outcomes such as the Abilene paradox. Nobody wanted to go to Abilene but each person thought that everybody else wanted to go to Abilene … so they all went. A false consensus existed and everybody played along.
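
The paradox is easy to simulate. Here is a minimal Python sketch with invented names, preferences, and decision rule: each person privately prefers to stay home but publicly echoes what he believes the group wants, so the vote to go is unanimous.

    # A minimal sketch of the Abilene paradox; the names, preferences,
    # and decision rule are all invented for illustration.

    def stated_preference(private_pref: str, perceived_majority: str) -> str:
        """Preference falsification: echo the perceived majority,
        whatever we privately want."""
        return perceived_majority

    family = ["Jerry", "Beth", "Grandpa", "Grandma"]      # hypothetical group
    private = {person: "stay home" for person in family}  # what each actually wants
    perceived = "go to Abilene"                           # what each thinks the others want

    votes = {person: stated_preference(private[person], perceived)
             for person in family}
    print(votes)                  # every public vote: 'go to Abilene'
    print(set(private.values()))  # every private preference: 'stay home'

Every public vote contradicts every private preference, yet the group’s decision looks unanimous.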

We can also see this happening with the risky shift. Groups tend to make riskier decisions than individuals. Why? Oftentimes, it’s because of a false consensus. Each member of the group assumes that other members of the group favor the riskier strategy. Nobody wants to be seen as a wimp, so each member agrees. The decision is settled – everybody wants to do it. This is especially problematic in cultures that emphasize teamwork.

Reinhold Niebuhr may have originated this stream of thought in his book Moral Man and Immoral Society, originally published in 1932. Niebuhr argued that individual morality and social morality were incompatible. We make individual decisions based on our moral understanding. We make collective decisions based on our understanding of what society wants, needs, and demands. More succinctly: “Reason is not the sole basis of moral virtue in man. His social impulses are more deeply rooted than his rational life.”

In 1995, the economist Timur Kuran updated this thinking with his book Private Truths, Public Lies. While Niebuhr focused on the origins of such behavior, Kuran focused more attention on the outcomes. He notes that preference falsification helps preserve “widely disliked structures” and confers an “aura of stability on structures vulnerable to sudden collapse.” Further, “When the support of a policy, tradition, or regime is largely contrived, a minor event may activate a bandwagon that generates massive yet unanticipated change.”
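
Kuran’s bandwagon can be sketched as a simple threshold model (the threshold values below are invented for illustration): each citizen dissents publicly only once the visible share of dissenters reaches a private threshold, so shifting a single citizen’s threshold to zero, one “minor event,” can tip the whole chain.

    # A rough threshold-cascade sketch of Kuran's bandwagon. The
    # thresholds are invented: citizen i dissents publicly only once
    # the visible share of dissenters reaches thresholds[i].

    def cascade(thresholds):
        """Return how many citizens end up dissenting publicly."""
        n = len(thresholds)
        dissenting = [False] * n
        changed = True
        while changed:
            changed = False
            share = sum(dissenting) / n
            for i, t in enumerate(thresholds):
                if not dissenting[i] and share >= t:
                    dissenting[i] = True
                    changed = True
        return sum(dissenting)

    stable  = [0.2, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9]
    fragile = [0.0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8]

    print(cascade(stable))   # 0: nobody moves first, so the regime looks solid
    print(cascade(fragile))  # 9: one citizen with nothing to lose tips the rest

The two societies differ only in a single citizen’s threshold, yet one looks permanently stable and the other collapses completely; that is the “aura of stability” in miniature.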

How can we mitigate the effects of such falsification? As with other cognitive biases, I doubt that we can eliminate the bias itself. As Lady Gaga sings, we were born this way. The best we can do is to be aware of the bias and question our decisions, especially when our individual (private) preferences differ from the (perceived) preferences of the group. When someone says, “Let’s go to Abilene,” we can ask, “Really? Does anybody really want to go to Abilene?” We might be surprised at the answer.


Ebola and Availability Cascades


We can’t see it so it must be everywhere!

Which causes more deaths: strokes or accidents?

The way you consider this question speaks volumes about how humans think. When we don’t have data at our fingertips (i.e., most of the time), we make estimates. We do so by answering a question – but not the question we’re asked. Instead, we answer an easier question.

In fact, we make it personal and ask a question like this:

How easy is it for me to retrieve memories of people who died of strokes compared to memories of people who died in accidents?

Our logic is simple: if it’s easy to remember, there must be a lot of it. If it’s hard to remember, there must be less of it.

So, most people say that accidents cause more deaths than strokes. Actually, that’s dead wrong. As Daniel Kahneman points out, strokes cause almost twice as many deaths as all accidents combined.

Why would we guess wrong? Because accidents are more memorable than strokes. If you read this morning’s paper, you probably read about several accidental deaths. Can you recall reading about any deaths by stroke? Even if you read all the obituaries, it’s unlikely.

This is typically known as the availability bias – the memories are easily available to you. You can retrieve them easily and, therefore, you overestimate their frequency. Thus, we overestimate the frequency of violent crime, terrorist attacks, and government stupidity. We read about these things regularly so we assume that they’re common, everyday occurrences.
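
One way to picture the bias is as estimation from a biased sample: we judge frequency from the stories we can recall, and dramatic deaths are heavily over-reported. Here is a hedged Python sketch of that idea (the counts and newsworthiness weights are invented):

    # The availability bias as biased sampling: we estimate base rates
    # from recalled news stories, and accidents are over-reported.
    # True counts and "newsworthiness" weights are invented.

    import random

    true_deaths = {"stroke": 200, "accident": 100}   # strokes actually dominate 2:1
    newsworthiness = {"stroke": 1, "accident": 20}   # accidents make better headlines

    def recalled_stories(k):
        """Draw k remembered stories, weighted by coverage, not by frequency."""
        causes = list(true_deaths)
        weights = [true_deaths[c] * newsworthiness[c] for c in causes]
        return random.choices(causes, weights=weights, k=k)

    sample = recalled_stories(1000)
    estimate = {c: sample.count(c) / len(sample) for c in true_deaths}
    print(estimate)   # accidents look roughly ten times as common as strokes

The true ratio favors strokes two to one, but the recalled sample inverts it; that inversion is exactly the error most of us make.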

We all suffer from the availability bias. But when we suffer from it en masse, it can become an availability cascade – a form of mass hysteria. Here’s how it works. (Timur Kuran and Cass Sunstein coined the term availability cascade; I’m using Daniel Kahneman’s summary.)

As Kahneman writes, an “… availability cascade is a self-sustaining chain of events, which may start from media reports of a relatively minor incident and lead up to public panic and large-scale government action.” Something goes wrong and the media reports it. It’s not an isolated incident; it could happen again. Perhaps it could affect a lot of people. Perhaps it’s an invisible killer whose effects are not evident for years. Perhaps you already have it. How would one know? Or perhaps it’s a gruesome killer that causes great suffering. Perhaps it’s not clear how one gets it. How can we protect ourselves?

Initially, the story is about the incident. But then it morphs into a meta-story. It’s about angry people who are demanding action; they’re marching in the streets and protesting in front of the White House. It’s about fear and loathing. Then experts get involved. But, of course, multiple experts never agree on anything. There are discrepancies in the stories they tell. Perhaps they don’t know what’s really going on. Perhaps they’re hiding something. Perhaps it’s a conspiracy. Perhaps we’re all going to die.

A story like this can spin out of control in a hurry. It goes viral. Since we hear about it every day, it’s easily available to our memories. Since it’s available, we assume that it’s very probable. As Kahneman points out, “…the response of the political system is guided by the intensity of public sentiment.”
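
As a toy model, the cascade is a feedback loop: coverage inflates perceived risk, perceived risk drives public concern, and concern feeds back into more coverage. Here is a minimal Python sketch with invented coefficients:

    # A toy availability-cascade loop. The true risk never changes;
    # only coverage and the perceived risk do. All coefficients are
    # invented for illustration.

    actual_risk = 0.01   # the true, small probability of harm
    coverage = 1.0       # media attention after one minor incident
    feedback = 50.0      # invented gain: public concern generates more stories

    for day in range(1, 8):
        perceived_risk = min(1.0, actual_risk * coverage)  # availability inflates the estimate
        concern = perceived_risk                           # sentiment tracks perceived risk
        coverage *= 1 + feedback * concern                 # more concern, more coverage
        print(f"day {day}: coverage={coverage:>12.1f}  perceived risk={perceived_risk:.3f}")

Within a simulated week, perceived risk saturates at certainty while the actual risk never moves; the political system then responds to the perceived number, not the real one.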

Think it can’t happen in our age of instant communications? Go back and read the stories about Ebola in America. It’s a classic availability cascade. Chris Christie, the governor of New Jersey, reacted quickly — not because he needed to but because of the intensity of public sentiment. Our 24-hour news cycle needs something awful to happen at least once a day. So availability cascades aren’t going to go away. They’ll just happen faster.
