This week’s featured posts.
Which causes more deaths: strokes or accidents?
The way you consider this question speaks volumes about how humans think. When we don’t have data at our fingertips (i.e., most of the time), we make estimates. We do so by answering a question – but not the question we’re asked. Instead, we answer an easier question.
In fact, we make it personal and ask a question like this:
How easy is it for me to retrieve memories of people who died of strokes compared to memories of people who died by accidents?
Our logic is simple: if it’s easy to remember, there must be a lot of it. If it’s hard to remember, there must be less of it.
So, most people say that accidents cause more deaths than strokes. Actually, that’s dead wrong. As Daniel Kahneman points out, strokes cause twice as many deaths as all accidents combined.
Why would we guess wrong? Because accidents are more memorable than strokes. If you read this morning’s paper, you probably read about several accidental deaths. Can you recall reading about any deaths by stroke? Even if you read all the obituaries, it’s unlikely.
This is known as the availability bias – the memories are easily available to you. You can retrieve them easily and, therefore, you overestimate their frequency. Thus, we overestimate the frequency of violent crime, terrorist attacks, and government stupidity. We read about these things regularly, so we assume that they’re common, everyday occurrences.
We all suffer from the availability bias. But when we all suffer from it at the same time, it can become an availability cascade – a form of mass hysteria. Here’s how it works. (Timur Kuran and Cass Sunstein coined the term availability cascade; I’m using Daniel Kahneman’s summary.)
As Kahneman writes, an “… availability cascade is a self-sustaining chain of events, which may start from media reports of a relatively minor incident and lead up to public panic and large-scale government action.” Something goes wrong and the media reports it. It’s not an isolated incident; it could happen again. Perhaps it could affect a lot of people. Perhaps it’s an invisible killer whose effects are not evident for years. Perhaps you already have it. How would one know? Or perhaps it’s a gruesome killer that causes great suffering. Perhaps it’s not clear how one gets it. How can we protect ourselves?
Initially, the story is about the incident. But then it morphs into a meta-story. It’s about angry people who are demanding action; they’re marching in the streets and protesting in front of the White House. It’s about fear and loathing. Then experts get involved. But, of course, multiple experts never agree on anything. There are discrepancies in the stories they tell. Perhaps they don’t know what’s really going on. Perhaps they’re hiding something. Perhaps it’s a conspiracy. Perhaps we’re all going to die.
A story like this can spin out of control in a hurry. It goes viral. Since we hear about it every day, it’s easily available to our memories. Since it’s available, we assume that it’s very probable. As Kahneman points out, “…the response of the political system is guided by the intensity of public sentiment.”
Think it can’t happen in our age of instant communications? Go back and read the stories about Ebola in America. It’s a classic availability cascade. Chris Christie, the governor of New Jersey, reacted quickly — not because he needed to but because of the intensity of public sentiment. Our 24-hour news cycle needs something awful to happen at least once a day. So availability cascades aren’t going to go away. They’ll just happen faster.
My career has been a steady diet of disruption.
Three times, disruptive innovations rocked the companies I worked for. First, the PC destroyed the word processing industry (which had destroyed the typewriter industry). Second, client/server applications disrupted host-centric applications. Third, cloud-based applications disrupted client/server applications.
Twice, my companies disrupted other companies. First, RISC processors disrupted CISC processors. Second, voice applications disrupted traditional call centers.
In 1997, a Harvard professor named Clayton Christensen took examples like these and fashioned a theory of disruptive innovation. In The Innovator’s Dilemma, he explained how it works: Your company is doing just fine and understands exactly what customers need. You focus on offering customers more of what they want. Then an alternative comes along that offers less of what customers want but is easier to use, more convenient, or less costly. You dismiss it as a toy. It eats your lunch.
The disruptive innovation typically offers less functionality than the product it disrupts. Early mobile phones offered worse voice quality than landlines. Early digital cameras produced worse pictures than film. But, they all offered something else that appealed to consumers: mobility, simplicity, immediate gratification, multi-functionality, lower cost, and so on. They were good enough on the traditional metrics and offered something new and appealing that tipped the balance.
My early experiences with disruption – before Christensen wrote his book – were especially painful. We didn’t understand what was happening to us. Why would customers choose an inferior product? We read books like Extraordinary Popular Delusions and the Madness of Crowds to try to understand. Was modern technology really nothing more than an updated version of tulip mania?
After Christensen’s book came out, we wised up a bit and learned how to defend against disruptions. It’s not easy but, at the very least, we have a theory. Still, disruptions show no sign of abating. Lyft and Uber are disrupting traditional taxi services. AirBnB is disrupting hotels. And MOOCs may be disrupting higher education (or maybe not).
Such disruption happens often enough that it seems almost intuitive to me. So, I was surprised when another Harvard professor, Jill Lepore, published a “take-down” article on disruptive innovation in a recent edition of The New Yorker.
Lepore’s article, “The Disruption Machine: What The Gospel of Innovation Gets Wrong”, appears to pick apart the foundation of Christensen’s work. Some of the examples from 1997 seem less prescient now. Some companies that were disrupted in the 90s have recovered since. (Perhaps we did get smarter). Disruptive companies, on the other hand, have not necessarily thrived. (Perhaps they, too, were disrupted).
Lepore points out that Christensen started a stock fund based on his theories in March 2000. Less than a year later, it was “quietly liquidated.” Unfortunately, she doesn’t mention that March 2000 was the very moment that the Internet bubble burst. Christensen may have had a good theory but he had terrible timing.
But what really irks Lepore is given away in her subtitle. It’s the idea that Christensen’s work has become “gospel”. People accept it on faith and try to explain everything with it. Consultants have taken Christensen’s ideas to the far corners of the world. (Full disclosure: I do a bit of this myself). In all the fuss, Lepore worries that disruptive innovation has not been properly criticized. It hasn’t been picked at in the same way as, say, Darwinism.
Lepore may be right but it doesn’t mean that Christensen is wrong. In the business world, we sometimes take ideas too literally and extend them too far. As I began my career, Peters and Waterman’s In Search of Excellence was almost gospel. We readers probably fell in love a little too fast. Yet Peters and Waterman had – and still have — some real wisdom to offer. (See The Hype Cycle for how this works).
I’ve read all but one of Christensen’s books and I don’t see any evidence that he promotes his work as a be-all, end-all grand Theory of Everything. He’s made careful observations and identified patterns that occur regularly. Is it a religion? No. But Christensen offers a good explanation of how an important part of the world works. I know. I’ve been there.
Let’s say you’re an army general and you want to move 1,000 troops from Point A to Point B. You’ll probably send out two types of orders. First, you’ll send direct orders to your officers, telling them how, when, and where to move.
Second, you’ll also send advisories to other units who need to be aware of your movements, including commissary, quartermaster, and transportation units. Though they don’t report directly to you, they need to know what your troops are doing. Otherwise, supplies, food, and ammunition will be in the wrong place at the wrong time. Chaos ensues.
According to Patricia Churchland in her brief-but-insightful book, Touching a Nerve, our brains essentially behave the same way. Let’s say your brain tells your eyes to move to the right. That’s pretty simple. But you also need to let the rest of your brain know what’s happening.
When your eyes move right, your brain could interpret it in at least two ways:
1) My eyes just moved to the right, or
2) The whole world just moved to the left.
The second interpretation is scary. The world moves in an unpredictable manner. You didn’t cause the movement. So, what did? Is someone playing a trick on you? Are malevolent spirits up to no good?
You can get an inkling of how this feels just by sitting in a car. If the car next to you rolls forward, you may feel that you’re rolling backwards. It’s a startling and unsettling experience until you realize what’s actually happening. Now imagine that all of your actions feel the same way. Your arm moves but not because of you. If you didn’t cause it, who or what did? Is it really your arm or an impostor?
We normally solve this problem by sending a memo to ourselves known as the efference copy. In essence, it’s a copy of the direct order sent to the muscle(s) in question. It lets the relevant portions of your brain know that you’re causing the action. It explains what’s going on. The world is not acting on you. You’re acting on the world.
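The memo-matching idea can be sketched as a toy comparison: the brain keeps a copy of its own command, predicts the sensory consequence, and checks the prediction against what actually happens. This is purely my illustrative model, not anything from Churchland’s book; the function name, the numbers, and the matching threshold are all invented for the example.

```python
def attribute_motion(observed_shift, efference_copy=None):
    """Return "me" if the observed visual shift matches what our own
    motor command predicts, else "not-me"."""
    # When the eyes move right by X degrees, the image on the retina
    # shifts left by X degrees -- that is the prediction the memo carries.
    predicted = -efference_copy if efference_copy is not None else 0.0
    return "me" if abs(observed_shift - predicted) < 1e-6 else "not-me"

# Eyes move 5 degrees right; the scene appears to shift 5 degrees left.
print(attribute_motion(-5.0, 5.0))   # memo arrived: I caused the motion
print(attribute_motion(-5.0, None))  # memo never arrived: "the world moved"
```

When the memo arrives, the prediction cancels the observed shift and the motion is attributed to “me”; when it never arrives, the same sensation gets blamed on the world.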
Churchland speculates that problems with the efference copy could be at the root of many mental disorders. (Churchland is not arguing that this is proven, only that it’s fertile ground for research.) A simple example is that we (normally) can’t tickle ourselves. We know – via the efference copy – that we’re the ones taking the action. We’re making something happen. When other people tickle us, there is no efference copy. Something is happening to us. People with efference copy problems, on the other hand, can indeed tickle themselves. It’s as if something is happening to them.
Similarly, we all hear voices in our heads. But most of us realize that the voice is our own. What if you didn’t? Whose voice would it be? A dead relative? God?
Ultimately, this is a question of me versus not-me. Most of us have a pretty clear idea of what me consists of. Even very young infants have a pretty clear idea of what their boundaries are. We learn to send memos to ourselves very early on. For some people, however, the memo never arrives. Chaos ensues.
I’m attracted to the opposite sex. I can’t help it. As Lady Gaga says, I was born this way. Lately, however, I’ve been reading that exposure to the opposite sex can lead to premature death, especially for males. It’s a scary thought.
As reported in the current issue of Science, the phenomenon might be called “female-induced demise” and it’s a clear cause-and-effect relationship. Researchers have shown that “… female-produced pheromones …can have detrimental effects on longevity and other age-related traits in male[s]….” Further, “It has long been known that having the opposite sex around can reduce fitness….”
You’re probably wondering, “Why didn’t someone warn me about this?” Before you get too upset, let me clarify that, so far, biologists have discovered the phenomenon only in nematode worms and fruit flies. Still, you have to wonder … today the nematode, tomorrow Homo sapiens? And how many of us men haven’t been called a worm at some time in our lives?
But wait, it gets worse. Members of the opposite sex don’t even have to be physically present. Merely perceiving the opposite sex is enough to do the trick. And yes, this goes both ways: male-to-female and female-to-male. Our sense of smell seems to play a critical role. You don’t have to interact with the opposite sex to die young; you merely have to inhale their pheromones. As Science points out, “This is sufficient to decrease fat stores, increase mortality, … and decrease an animal’s overall size.”
But wait, it gets even worse. It also happens with food. Let’s say you’re on a low-calorie diet. You maintain your discipline, count your calories, and avoid fatty foods. But even the smell of fatty foods may be enough to limit the benefits of your diet. Science points out that “…just the smell of a rich diet is enough to increase mortality rate and prevent many of the benefits of a low-calorie diet.”
OK, OK … we’re talking about fruit flies and worms. And yet, you have to wonder. Is this why men have shorter life expectancies than women? Do people live longer in cultures that strictly segregate the sexes? Has Woody Allen heard about this? And do we need to amend the old saying to “Cut off your nose to spite your face … and prolong your life”?
We all may well agree that 2013 was just plain weird. So, what’s next? Well, 2014 is the 100th anniversary of the beginning of World War I. It seems that all of our recent wars result from World War I, directly or indirectly. Perhaps we should just re-name the era the Second Hundred Years War.
Are there brighter things ahead? Do we have something to look forward to? Here are some suggestions from some of my favorite sources.
Meet your genome – Science magazine suggests that the era of personal medicine is just beginning. We’ll sequence your genome to develop personalized treatments for diseases like cancer or multiple sclerosis. In fact, it won’t be long before we sequence the genome of every newborn baby, just as a matter of course.
Meet your advertiser – as medicine gets personal, so does advertising. We’re changing from broadcast adverts to narrowcast – targeting demographic slivers wherever we can find them. Soon, it will be personalcast – advertising aimed at you and only you. Brick-and-mortar stores are even developing tools to track your movements in the store and make real-time special offers based on where you are.
Meet the robots – Technology Review notes that robots are ready to take their place in the workforce. They’ll start in dangerous places like battlefield rescues, but they’ll soon be able to “integrate seamlessly and safely in human spaces.” How will they learn? By studying us.
Meet your drone rescuer – the World Bank says that drones will be a “game changer” in disaster relief. They’ll help pinpoint where the problems are and drop supplies to isolated survivors. They might even “drone-lift” survivors to safety.
Meet an extinct species – 2014 is also the 100th anniversary of the extinction of the passenger pigeon. There are plans to bring it back. What next? I wonder if a T. rex would make a good pet.
Meet Consumption 2.0 – why bother to own things? Why not just pay for each use? We see it with music streaming … why not other things? We could conceivably stream books and magazines and pay for each page we read. Similarly, I just bought a new mobile phone. But I didn’t really buy it. I bought a service that provides me a phone and the right to upgrade it once a year. With technology changing so fast, why would you buy it?
Insert your computer here – biological transistors should allow us to insert computers into any living cell. That may help us repair or replace diseased bits of soft tissue, just as we can replace bones and joints today. Indeed, bio-computers might help us understand our own brains better. We didn’t really understand what our hearts did until we invented pumps. We may not really understand what our brains do until we build biological computers.
Meet the tech-lash – robots, bio-brains, big data, technology-driven job destruction, loss of privacy, drones, and so on. Where will it all lead? According to The Economist, it will almost certainly lead to a tech-lash – as the technology elite “join bankers and oilmen in public demonology … in a peasants’ revolt against the sovereigns of cyberspace.”
Meet the world champion – of course, 2014 also brings us the World Cup of football. My country is in the “group of death” and I fear that we won’t make it to the knockout round. My money’s on Germany.