Strategy. Innovation. Brand.

travis white


FOMO/JOMO. Be Here Now.

Be here now.

Julia and Elliot recently went to a wedding in Eureka, Colorado, a ghost town situated high in the San Juan Mountains. To say that Eureka is isolated is a vast understatement. Here are some things that the town doesn’t have: landlines, television, internet, wi-fi, mobile phone access, cable, newspapers, radio, and paved roads. When you’re there, you’re there.

Just before the wedding ceremony, ushers collected everyone’s cameras and mobile phones. The couple seemed to be saying to their guests: “We’re glad you’re here. We hope you’re with us fully and completely. Don’t fuss with your cameras and phones. Engage with us in a profound experience. Be here now.”

The place and the process reminded me of Daniel Kahneman’s definitions of the experiencing self and the remembering self. We can focus, engage our senses, and fully experience an activity. What we remember, however, is far different from what we experience. We typically remember two things: 1) the peak experience – the high point of the activity; 2) the end state – how things ended up. (For more on memory traps, click here).

The difference between the two selves has many implications. We remember things differently than they actually happened. This calls into question such things as eyewitness testimony and historical accounts. It may also be why we argue with our spouses – we simply remember things differently. It’s another good reason not to argue in the past tense. (For other reasons, click here).

The difference also affects how we plan our activities. We can plan to: 1) enhance the experience; or 2) enhance our memory of the experience. Let’s say you go to your favorite restaurant. If you want to enhance the experience, you should order your favorite dish. You can enjoy the anticipation and the experience itself. However, you won’t create a new memory. It will simply blend in with all the other times you’ve ordered the same dish. If you want a new memory, order something you’ve never had before. It may be great or not – you can’t anticipate – but it will be more memorable.*

FOMO, of course, is the Fear of Missing Out, which seems to be an increasing concern in today’s society. Everything is accessible and we don’t want to miss any of it. Technologies such as mobile phones, video chat, and instant messaging democratize our experience. We can share anything with anyone at any time. We won’t miss a thing.

But, of course, we do miss things. In fact, the very act of inserting technologies into our experiences makes us miss some of the experience. We’re fussing with our cameras rather than experiencing the action. We’re checking baseball scores rather than engaging with others. The desire to miss nothing causes us to miss something: the intensity of the present moment.

FOMO shifts our attention from the experiencing self to the remembering self. We take pictures, which helps us remember and share an experience. But the act of taking pictures insulates us from the experience itself. We’ve inserted technology between our experiences and ourselves.

As you can probably guess, I’m not the first person to suggest that FOMO mania actually causes us to miss much more than we realize. The tech and culture blogger Anil Dash coined the term JOMO – the Joy Of Missing Out – more than two years ago. Christina Crook wrote a book called The Joy of Missing Out and popularized the idea of Internet fasts. Sarah Wilson points out that FOMO has eradicated traditional boundaries that separated public time from private time. It used to be easy to spend a quiet evening at home. Now we need to declare an Internet fast to get some alone time.

Though it’s not a new idea, I suspect that the JOMO message needs some more evangelists. As a famous American once said: Be Here Now.

*I adapted the restaurant example from an episode of the Hidden Brain podcast called Hungry, Hungry Hippocampus.

Information Avoidance and Persuasion

Don’t tell me.

The 1989 Tour de France was decided in the last stage, a 15.2 mile time trial into Paris. The leader, Laurent Fignon, held a fifty second advantage over Greg LeMond. Both riders were strong time trialers. To make up fifty seconds in such a short race seemed impossible. Most observers assumed that Fignon would hold his lead and win the overall title.

In most time trials, coaches radio the riders to inform them of their speed, splits, and competitive position. In this final time trial, however, LeMond turned off his radio. He didn’t want to know. He feared that, if he knew too much, he might ease up. Instead, he raced flat out for the entire distance, averaging 33.9 miles per hour, a record at the time. In a stunning finish, LeMond gained 58 seconds on Fignon and won the race by a scant eight seconds. (Here’s a terrific video recap of the final stage).

LeMond’s strategy is today known as information avoidance. He chose not to accept information that he knew was freely available to him. LeMond knew that he might be distracted by the information. He chose instead to focus solely on his own performance – the only variable that he could control.

While information avoidance worked for LeMond, the strategy often yields suboptimal outcomes. We choose not to know something and the not knowing creates health hazards, financial obstacles, and a series of unfortunate events. Here are some examples.

  • In a study of 7,000 employees at a large non-profit organization, Giulio Zanella and Ritesh Banerjee found that women are less likely to get a mammogram when one of their co-workers is diagnosed with breast cancer. The mammogram rate dropped by approximately eight percent and the effect lasted for at least two years. (Click here).
  • Ananda Ganguly and Josh Tasoff offered students tests to determine if they carried the herpes simplex virus. Though the tests were free and readily available, about five percent of the students refused the test for the HSV1 form of the virus. Fifteen percent refused the test for the HSV2 form of the virus, which is widely regarded as the “nastier” version. In other words, the scarier the disease, the more likely people are to avoid information about it. (Click here).
  • Marianne Andries and Valentin Haddad investigated similar effects in financial decisions. They found that “…information averse investors observe the value of their portfolios infrequently; inattention is more pronounced …in periods of low or volatile stock prices.” Again, the scarier the situation, the less likely people are to search for information about it. (Click here).
  • Russell Golman, David Hagmann, and George Loewenstein also investigated economic decision making and identified five information avoidance techniques: 1) physical avoidance; 2) inattention; 3) biased interpretation; 4) forgetting; 5) self-handicapping. (Click here).

In some ways, information avoidance is the flip side of the confirmation bias. We accept information that confirms our beliefs and avoid information that doesn’t. But there seems to be more to avoidance than simply the desire to avoid disconfirming information. Other contributors include:

  • Focus and fatalism – why learn something that we can do nothing about? I suspect that this was LeMond’s motivation. He couldn’t do anything about the information, so why receive it? Instead he focused on what he could do.
  • Anxiety – why learn something that will simply make us anxious? The scarier it is, the more anxious we’ll be. We’ve all put off visits to the doctor because we just don’t want to know.
  • Ego threat – why learn something that will shake our confidence in our own abilities? It seems, for instance, that poor teachers are less likely to pay attention to student evaluations than are good teachers.

Information avoidance can also teach us about persuasion. If we want to persuade people to change their opinion about something, making it scarier is probably self-defeating. People will be more likely to avoid the information than to seek it out. Similarly, bombarding people with more and more information is likely to be counter-productive. People under bombardment become defensive rather than open-minded.

As Aristotle noted, persuasion consists of three facets: 1) ethos (credibility); 2) pathos (emotional connection); 3) logos (logic and information). Today, we often seek to persuade with logos – information and logic. But Aristotle taught that logos is the least persuasive facet. We typically use logos to justify a decision rather than to make a decision. Ethos and pathos are much more influential in making the decision. The recent research on information avoidance suggests that we’ll persuade more people with ethos and pathos than we ever will with logos. Aristotle was right.

Greg LeMond’s example shows that information avoidance can provide important benefits. But, as we develop our communication strategies, let’s keep the downsides in mind. We need to package our arguments in ways that will reduce information avoidance and lead to a healthier exchange of ideas.

Concept Creep and Pessimism

What’s your definition of blue?

My friend, Andy, once taught in the Semester at Sea program. The program has an ocean-going ship and brings undergraduates together for a semester of sea travel, classes, and port calls. Andy told me that he was fascinated watching these youngsters come together and form in-groups and out-groups. The cliques were fairly stable while the ship was at sea but more fluid when it was in port.

Andy told me, for instance, that some of the women described some of the men as “Ship cute, shore ugly.” The very concept of “cute” was flexible and depended entirely on context. When at sea, a limited supply of men caused the “cute” definition to expand. In port, with more men available, the definition of cute became more stringent.

We usually think of concepts as more-or-less fixed, unlike other processes that expand over time. The military, for instance, is familiar with “mission creep” – a mission may start with small, well-defined objectives, but those objectives often grow over time. Similarly, software developers understand “feature creep” – new features are added as the software is developed. But do concepts creep? The Semester at Sea example suggests that they do, depending on prevalence.

This was also the finding of a research paper published in a recent issue of Science magazine. (Click here). Led by David Levari, the researchers showed that “… people often respond to decreases in the prevalence of a stimulus by expanding their concept of it.” In the Semester at Sea example, as the stimulus (men) decreases, the concept of cute expands. According to Levari et al., this is a common phenomenon and not just related to hormonal youngsters isolated on a ship.

The researchers started with a very neutral stimulus – the color of dots. They presented 1,000 dots ranging in color from purple to blue and asked participants to identify the blue ones. They repeated the trial several hundred times. Participants were remarkably consistent in each trial. Dots identified as blue in the first trials were still identified as blue in the last trials.

The researchers then repeated the trials while reducing the number of blue dots. Would participants in the second set of trials – with decreased stimulus — expand their definition of “blue” and identify dots as blue that they had originally identified as purple? Indeed, they would. In fact, the number of purple-to-blue “crossover” dots was remarkably consistent through numerous trials.

The researchers also varied the instructions for the comparisons. In the first study, participants were told that the number of blue dots “might change” in the second pass. In a second study, participants were told that the number of blue dots would “definitely decrease.” In a third study, participants were instructed to “be consistent” and were offered monetary rewards for doing so. In some studies the number of blue dots declined gradually. In others, the blue dots decreased abruptly. These procedural changes had virtually no impact on the results. In all cases, declining numbers of blue dots resulted in an expanded definition of “blue”.
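To make the mechanism concrete, here’s a minimal simulation sketch in Python. It assumes a deliberately simple “relative” judge – one that calls a dot blue whenever its hue is bluer than the average of the dots it has seen recently. That rule is my own illustrative assumption, not the model Levari and his colleagues tested; the point is only that a prevalence-sensitive rule reproduces the qualitative pattern.

```python
import random

def run_block(blue_share, n_dots=1000, window=100, seed=0):
    """Simulate one block of dot judgments by a prevalence-sensitive judge.

    Each dot gets a hue in [0, 1]; hues above 0.5 count as 'objectively' blue.
    The judge calls a dot blue when its hue exceeds the mean hue of the last
    `window` dots -- a relative rule that is an assumption of this sketch,
    not the procedure or model reported in the Science paper.
    """
    rng = random.Random(seed)
    recent = []
    crossovers = 0  # objectively purple dots that the judge calls blue
    for _ in range(n_dots):
        if rng.random() < blue_share:
            hue = rng.uniform(0.5, 1.0)   # objectively blue
        else:
            hue = rng.uniform(0.0, 0.5)   # objectively purple
        threshold = sum(recent) / len(recent) if recent else 0.5
        if hue > threshold and hue < 0.5:
            crossovers += 1                # a purple dot "creeps" into blue
        recent.append(hue)
        if len(recent) > window:
            recent.pop(0)
    return crossovers

# When blue dots are common, the judged boundary stays near 0.5;
# when they become rare, the boundary drifts down and purple dots cross over.
print("blue share 50%:", run_block(blue_share=0.5), "crossovers")
print("blue share 10%:", run_block(blue_share=0.1), "crossovers")
```

Running the sketch, the low-prevalence block produces many times more purple-to-blue crossovers than the balanced block – the same direction as the published result, though nothing about the magnitudes should be read into a toy model like this.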

Does concept creep extend beyond dots? The researchers did similar trials with 800 images of human faces that had been rated on a continuum from “very threatening” to “not very threatening.” The results were essentially the same as the dot studies. When the researchers reduced the number of threatening faces, participants expanded their definition of “threatening.”

All these tests used visual stimuli. Does concept creep also apply to nonvisual stimuli? To test this, the researchers asked participants to evaluate whether 240 research proposals were ethical or not. The results were essentially the same. When the participants saw many unethical proposals, their definition of ethics was fairly stringent. When they saw fewer unethical proposals, their definition expanded.

It seems then that “prevalence-induced concept change” – as the researchers label it – is probably common in human behavior. Could this help explain some of the pessimism in today’s world? For example, numerous sources verify that the crime rate in the United States has declined over the past two decades. (See here, here, and here, for example). Yet many people believe that the crime rate has soared. Could concept creep be part of the problem? It certainly seems likely.

Yet again, our perception of reality differs from actual reality. Like other cognitive biases, concept creep distorts our perception in predictable ways. As the number of stimuli – from cute to blue to ethical – goes down, we expand our definition of what the concept actually means. As “bad news” decreases, we expand our definition of what is “bad”. No wonder we’re pessimistic.

Delayed Intuition – How To Hire Better

On a scale of 1 – 5, is he technically proficient?

Daniel Kahneman is rightly respected for discovering and documenting any number of irrational human behaviors. Prospect theory – developed by Kahneman and his colleague, Amos Tversky – has led to profound new insights into how we think, behave, and spend our money. Indeed, there’s a straight line from Kahneman and Tversky to the new discipline called Behavioral Economics.

In my humble opinion, however, one of Kahneman’s innovations has been overlooked. The innovation doesn’t have an agreed-upon name so I’m proposing that we call it the Kahneman Interview Technique or KIT.

The idea behind KIT is fairly simple. We all know about the confirmation bias – the tendency to attend to information that confirms what we already believe and to ignore information that doesn’t. Kahneman’s insight is that confirmation bias distorts job interviews.

Here’s how it works. When we meet a candidate for a job, we immediately form an impression. The distortion occurs because this first impression colors the rest of the interview. Our intuition might tell us, for instance, that the candidate is action-oriented. For the rest of the interview, we attend to clues that confirm this intuition and ignore those that don’t. Ultimately, we base our evaluation on our initial impressions and intuition, which may be sketchy at best. The result – as Google found – is that there is no relationship between an interviewer’s evaluation and a candidate’s actual performance.

To remove the distortion of our confirmation bias, KIT asks us to delay our intuition. How can we delay intuition? By focusing first on facts and figures. For any job, there are prerequisites for success that we can measure by asking factual questions. For instance, a salesperson might need to be: 1) well spoken; 2) observant; 3) technically proficient, and so on. An executive might need to be: 1) a critical thinker; 2) a good strategist; 3) a good talent finder, etc.

Before the interview, we prepare factual questions that probe these prerequisites. We begin the interview with facts and develop a score for each prerequisite – typically on a simple scale like 1 – 5. The idea is not to record what the interviewer thinks but rather to record what the candidate has actually done. This portion of the interview is based on facts, not perceptions.

Once we have a score for each dimension, we can take the interview in more qualitative directions. We can ask broader questions about the candidate’s worldview and philosophy. We can invite our intuition to enter the process. At the end of the process, Kahneman suggests that the interviewer close her eyes, reflect for a moment, and answer the question, How well would this candidate do in this particular job?
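A small sketch may help make the ordering concrete. The Python below is not Kahneman’s procedure in code – the dimension names, the 1 – 5 scale enforcement, and the plain unweighted average are my own illustrative assumptions – but it shows the shape of the idea: factual scores are collected and summarized first, and the global, intuitive judgment is recorded only afterwards, as a separate number.

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class StructuredInterview:
    """Scoring sheet for a 'delayed intuition' interview (illustrative sketch)."""
    candidate: str
    dimensions: list                      # prerequisites chosen before the interview
    scores: dict = field(default_factory=dict)

    def score(self, dimension, value):
        # Only score dimensions defined up front, on a simple 1-5 scale.
        if dimension not in self.dimensions:
            raise ValueError(f"'{dimension}' was not defined before the interview")
        if not 1 <= value <= 5:
            raise ValueError("scores use a 1-5 scale")
        self.scores[dimension] = value

    def factual_average(self):
        # Summarize the factual portion before inviting any intuition.
        if set(self.scores) != set(self.dimensions):
            raise ValueError("score every dimension before summarizing")
        return mean(self.scores.values())

# Facts first ...
interview = StructuredInterview(
    candidate="A. Candidate",
    dimensions=["well spoken", "observant", "technically proficient"],
)
interview.score("well spoken", 4)
interview.score("observant", 3)
interview.score("technically proficient", 5)
print("factual average:", round(interview.factual_average(), 2))

# ... intuition last: the closed-eyes global question, recorded separately.
global_judgment = 4  # "How well would this candidate do in this particular job?"
print("delayed global judgment:", global_judgment)
```

In practice the dimensions would come from the job itself and might be weighted or pooled across interviewers; what matters for delayed intuition is only the sequence of the two steps.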

Kahneman and other researchers have found that the factual scores are much better predictors of success than traditional interviews. Interestingly, the concluding global evaluation is also a strong predictor, especially when compared with “first impression” predictions. In other words, delayed intuition is better at predicting job success than immediate intuition. It’s a good idea to keep in mind the next time you hire someone.

I first learned about the Kahneman Interview Technique several years ago when I read Kahneman’s book, Thinking, Fast and Slow. But the book is filled with so many good ideas that I forgot about the interviews. I was reminded of them recently when I listened to the 100th episode of the podcast, Hidden Brain, which features an interview with Kahneman. This article draws on both sources.

Why Study Critical Thinking?

Friend or foe?

People often ask me why they should take a class in critical thinking. Their typical refrain is, “I already know how to think.” I find that the best answer is a story about the mistakes we often make.

So I offer up the following example, drawn from recent news, about very smart people who missed a critical clue because they were not thinking critically.

The story is about the conventional wisdom surrounding Alzheimer’s. We’ve known for years that people who have Alzheimer’s also have higher than normal deposits of beta amyloid plaques in their brains. These plaques build up over time and interfere with memory and cognitive processes.

The conventional wisdom holds that beta amyloid plaques are an aberration. The brain has essentially gone haywire and starts to attack itself. It’s a mistake. A key research question has been: how do we prevent this mistake from happening? It’s a difficult question to answer because we have no idea what triggered the mistake.

But recent research, led by Rudolph Tanzi and Robert Moir, considers the opposite question. What if the buildup of beta amyloid plaques is not a mistake? What if it serves some useful purpose? (Click here and here for background articles).

Pursuing this line of reasoning, Tanzi and Moir discovered that beta amyloid is actually an antimicrobial substance. It has a beneficial purpose: to attack bacteria and viruses and smother them. It’s not a mistake; it’s a defense mechanism.

Other Alzheimer’s researchers have described themselves as “gobsmacked” and “surprised” by the discovery. One said, “I never thought about it as a possibility.”

A student of critical thinking might ask, Why didn’t they think about this sooner? A key tenet of critical thinking is that one should always ask the opposite question. If conventional wisdom holds that X is true, a critical thinker would automatically ask, Is it possible that the opposite of X is true in some way?

Asking the opposite question is a simple way to identify, clarify, and check our assumptions. When the conventional wisdom is correct, it leads to a dead end. But, occasionally, asking the opposite question can lead to a Nobel Prize. Consider the case of Barry Marshall.

A doctor in Perth, Australia, Marshall was concerned about his patients’ stomach ulcers. Conventional wisdom held that bacteria couldn’t possibly live in the gastric juices of the human gut. So bacteria couldn’t possibly cause ulcers. More likely, stress and anxiety were the culprits. But Marshall asked the opposite question and discovered the bacteria now known as H. pylori. Stress doesn’t cause ulcers; bacteria do. For asking the opposite question – and answering it – Marshall won the Nobel Prize in Medicine in 2005.

The discipline of critical thinking gives us a structure and method – almost a checklist – for how to think through complex problems. We should always ask the opposite question. We should be aware of common fallacies and cognitive biases. We should understand the basics of logic and argumentation. We should ask simple, blunt questions. We should check our egos at the door. If we do all this – and more – we tilt the odds in our favor. We prepare our minds systematically and open them to new possibilities – perhaps even the possibility of curing Alzheimer’s. That’s a good reason to study critical thinking.
