Self-Herding At Breakfast

Just like Grandma served.

I’ve always believed that breakfast is the most important meal of the day. Why? Because my mother told me so. Why did she believe it? Because her mother told her so. Who told her? Probably Edward Bernays, “the father of public relations.”

Is it true that breakfast is the most important meal of the day? Well, maybe not. If not, I’ve been self-herding for most of my life. I reached a decision (without much thinking) that breakfast was important. My only evidence was my mother’s advice.

Making the decision may have been a mistake. But, c’mon … she was my Mom. The more egregious mistake is that I never doubled back on the decision to see if anything had changed. I made the decision and never thought about it again. I self-herded into a set of fixed behaviors.

I also suffered from confirmation bias. From time to time, researchers published articles confirming that breakfast is important. These studies confirmed what I already believed. Since they didn’t challenge my mental framework, I didn’t bother to check them closely. I just assumed they were good science.

As it turns out, those studies were observational. Researchers observed people’s behavior and noted that people who ate breakfast were generally healthier and less likely to be obese than people who didn’t. Clearly, breakfast is important.

But let’s think about this critically. There are at least three possible relationships among the variables:

  • Eating breakfast causes people to be healthier – breakfast causes health.
  • Healthier people eat breakfast more than unhealthy people – health causes breakfast.
  • Healthier people eat breakfast and also do other things that contribute to good health – hidden variable(s) lead to healthiness and also cause people to eat breakfast.

With observational studies, researchers can’t easily sort out what causes what.
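
To make the third possibility concrete, here’s a toy simulation in Python (hypothetical numbers throughout, with physical activity standing in for the hidden variable). Breakfast has no causal effect at all in this model, yet an observational comparison still makes breakfast eaters look healthier:

```python
import random

random.seed(42)

# Hidden variable: physical activity drives BOTH breakfast habits and health.
# Breakfast itself has zero causal effect on health in this model.
n = 10_000
records = []
for _ in range(n):
    active = random.random() < 0.5
    eats_breakfast = random.random() < (0.8 if active else 0.3)
    health = random.gauss(75 if active else 60, 10)
    records.append((eats_breakfast, health))

eaters = [h for eats, h in records if eats]
skippers = [h for eats, h in records if not eats]

print(f"Mean health, breakfast eaters:   {sum(eaters) / len(eaters):.1f}")
print(f"Mean health, breakfast skippers: {sum(skippers) / len(skippers):.1f}")
# Eaters score several points higher -- not because breakfast helps,
# but because active people are overrepresented among them.
```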

So James Betts and his colleagues did an experimental study – as opposed to an observational study – on the relationship between breakfast and good health. (The original article is here. The popular press has also covered the story, including the New York Times, Time magazine, and Outside magazine.)

Betts’ research team randomly assigned people to one of two groups. One group had to eat breakfast every day; the other had to fast through the morning. Random assignment isolates the independent variable and allows researchers to establish causality.
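
To see why random assignment matters, tweak the toy model from above: let a coin flip, not activity level, decide who eats breakfast. The hidden variable now spreads evenly across both groups, and the spurious gap vanishes (again, purely illustrative numbers):

```python
import random

random.seed(42)

# Same hypothetical model, but breakfast is assigned by coin flip, so it
# can no longer correlate with the hidden activity variable.
n = 10_000
breakfast_group, fasting_group = [], []
for _ in range(n):
    active = random.random() < 0.5
    health = random.gauss(75 if active else 60, 10)
    if random.random() < 0.5:
        breakfast_group.append(health)
    else:
        fasting_group.append(health)

print(f"Breakfast group: {sum(breakfast_group) / len(breakfast_group):.1f}")
print(f"Fasting group:   {sum(fasting_group) / len(fasting_group):.1f}")
# The means now match up to sampling noise: with no true effect,
# randomization leaves the hidden variable nothing to hide behind.
```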

The trial ran for six weeks. The result: nothing. The researchers found no major health or weight differences between the two groups.

But previous research had found a correlation between breakfast and good health. So what caused what? It was probably a cluster of hidden variables. Betts noted, for instance, “…the breakfast group was much more physically active than the fasting group, with significant differences particularly noted during light-intensity activities during the morning.”

So it may not be breakfast that creates healthier outcomes. It may be that breakfast eaters are also more physically active. It’s the activity, not the breakfast, that promotes wellness.

If that’s true, I’ve been self-herding for many years. I didn’t re-check my sources. If I had, I might have discovered that Edward Bernays launched a PR campaign in the 1920s to encourage people to eat a hearty breakfast, with bacon and eggs. Bernays was working for a client – Beech-Nut Packing Company – that sold pork products, including bacon. I suspect the campaign influenced my grandmother who, in turn, influenced my mother who, in turn, influenced me. The moral of the story: check your sources, re-check them periodically, and be suspicious of observational studies. And don’t believe everything that your mother tells you.

(By the way, I recently published two short articles about the effects of chocolate and sex on cognition. Both of these articles were based on observational studies. Caveat emptor).

Heard of Self-Herding?

Should’ve gotten Grover’s Grind.

How many times do you need to make the same decision?

Let’s say that, on your drive to work, there are two drive-through coffee shops: Grover’s Grind and The Freckled Beauty. You try each and decide that you prefer the mocha delight from The Freckled Beauty. Why would you ever make that same decision again? It’s more efficient to make the decision once and repeat the behavior as often as needed.

Let’s change the context. You’re walking down a busy street in a big city when you see a cluster of, say, six people. They’re all looking upward and pointing to a tall building. Chances are that you’ll slow down and look up as well. The cluster of people has “herded” you into behaving the same way they behave.

Herding affects us in many ways. Teenagers wear essentially the same clothing because they want to be part of the same herd. College professors dress like college professors. Similarly, if we’re surrounded by liberals, we tend to lean liberal. If surrounded by conservatives, we tend to lean conservative. We sort ourselves into different herds based on appearances, clothing, lifestyles, political positions, religion, and so on.

Herding is essentially a cognitive bias. Instead of thinking through a decision and using logic to reach an advantageous conclusion, we use a shortcut (also known as a heuristic). We let the herd think for us. If it’s good enough for them, it’s probably good enough for me.

Like most cognitive biases, herding leads us to good conclusions much of the time … but not always. When it goes wrong, it does so in predictable ways. As Dan Ariely says in the title of his book, we’re Predictably Irrational.

If we think about it, it’s easy to recognize herding. With a little forethought, we can defend ourselves against groupthink. But what about self-herding, a notion that Ariely developed? Can you easily recognize it? Can you defend yourself against it?

Self-herding has to do with difficult questions. Daniel Kahneman pointed out that, when we’re asked a hard question, we often substitute an easy question and answer that instead. Here’s a hard question: “How likely is it that you’ll be shot in your neighborhood?” We don’t know the answer, so we substitute an easier question: “How many neighborhood shooting incidents can I recall from memory?” If we can remember many such incidents, then we assume that a recurrence is highly probable. This is known as the availability bias – we assume that things that are easily available to memory are likely to happen again.

Self-herding is a variant of the availability bias. As Ariely points out, it’s not easy to answer a question like, “What’s the best place to eat in your neighborhood?” So we substitute an easier question, “Where have I eaten before that I really liked?” Ariely notes that, “We can consult our preferences or we can consult our memory. It turns out it’s often easier to consult our memory.”

When you continue to choose The Freckled Beauty over Grover’s Grind, you’re herding yourself. It was the right decision at one time and you assume that it continues to be the right decision. It’s an efficient way to think. It’s also easy – you use your memory rather than your thinking muscles.

But, as we all know, things change. In fact, the speed of change seems to be accelerating. If the conditions that led to our initial decision change, then the decision is no longer valid. We can miss important opportunities and make serious mistakes. Every now and then, we need to un-herd ourselves.

Innovation’s Speech Impediment

It’s not a good idea if I don’t understand it.

One of the most important obstacles to innovation is the cultural rift between technical and non-technical managers. The problem is not the technology per se, but the communication of the technology. Simply put, technologists often baffle non-technical executives and baffled executives won’t support change.

To promote innovation, we need to master the art of translating between two different cultures: technical and non-technical. We need to find a common language and vocabulary. Most importantly, we need to speak to business needs and opportunities, not to the technology itself.

In my Managing Technology class, my students act as the CIO of a fictional company called Vair. The students study Vair’s operations (in a 12-page case study) and then recommend how technical innovations could improve business operations.

Among other things, they present a technical innovation to a non-technical audience. They always come up with interesting ideas and useful technologies. And they frequently err on the side of being too technical. Their presentations are technically sound but would be baffling to most non-technical executives.

Here are the tips I give to my students on giving a persuasive presentation to a non-technical audience. I thought you might find them useful as well.

Benefits and the so what question – we often state intermediate benefits that are meaningful to technologists but not to non-technical executives. Here’s an example: “By moving to the cloud, we can consolidate our applications.” Technologists know what that means and can intuit the benefits. Non-technical managers can’t. To get your message across, run a so what dialogue in your head:

Statement: “By moving to the cloud, we can consolidate our applications.”

Question: “So what?”

Statement: “That will allow us to achieve X.”

Question: “So what?”

Statement: “That means we can increase Y and reduce Z.”

Question: “So what?”

Statement: “Our stock price will increase by 12%.”

Asking so what three or four times is usually enough to get to a logical end point that both technical and non-technical managers can easily understand.

Give context and comparisons – sometimes we have an idea in mind and present only that idea, with no comparisons. We might, for instance, present J.D. Edwards as if it’s the only choice in ERP software. If you were buying a house, you would probably look at more than one option. You want to make comparisons and judge relative value. The same holds true in a technology presentation. Executives want to believe that they’re making a choice rather than simply rubber-stamping a recommendation. You can certainly guide them toward your preferred solution. By giving them a choice, however, the executives will feel more confident that they’ve chosen wisely and, therefore, will support the recommendation more strongly.

Show, don’t tell – chances are that technologists have coined new jargon and acronyms to describe the innovation. Chances are that non-technical people in the audience won’t understand the jargon — even if they’re nodding their heads. Solution: use stories, analogies, or examples:

  • Stories – explain how the innovation came about, who invented it, and why. Put real people in the story. Explain what problems existed before the innovation arrived. How is the world better now?
  • Analogies – compare it to something that the audience knows and understands. A Service Oriented Architecture (SOA), for instance, is remarkably similar to a modular stereo system.
  • Examples – tell the story of other companies that are using the technology. What benefits have they gained?

Words, words, words – we often prepare a script for a presentation and then put most of it on our slides. The problem is that the audience will either listen to you or read your slides. They won’t do both. You want them to listen to you – you’re much more important than the slides. You’ll need to simplify your slides. The text on the slide should capture the headline. You should tell the rest of the story.

If you follow these tips, the executives in your audience are much more likely to comprehend the innovation’s benefits. If they comprehend the benefits, they’re much more likely to support the innovation.

(If you’d like a copy of the Vair case study, just send me an e-mail. I’m happy to share it.)

Chocolate or Sex?

Such difficult choices.

A few days ago I published a brief article, Chocolate Brain, which discussed the cognitive benefits of eating chocolate. Bottom line: people who eat chocolate (like my sister) have better cognition than people who don’t. As always, there are some caveats, but it seems that good cognition and chocolate go hand in hand.

I was headed to the chocolate store when I was stopped in my tracks by a newly published article in the journal Age and Ageing. The title, “Sex on the brain! Associations between sexual activity and cognitive function in older age”, pretty much explains it all. (Click here for the full text).

The two studies – chocolate versus sex – are remarkably parallel. Both use data collected over the years through longitudinal studies. The chocolate study looked at almost 1,000 Americans who have been studied since 1975 in the Maine-Syracuse Longitudinal Study. The sex study looked at data from almost 7,000 people who have participated in the English Longitudinal Study of Aging (ELSA).

Both longitudinal studies gather data at periodic intervals; both studies are now on wave 6. The chocolate study included people aged 23 to 98. The sex study looked only at older people, aged 50 to 89.

Both studies also used standard measures of cognition. The chocolate study used six standard measures of cognition. The sex study used two: “…number sequencing, which broadly relates to executive function, and word recall, which broadly relates to memory.”

Both studies looked at the target variable – chocolate or sex – in binary fashion. Either you ate chocolate or you didn’t; either you had sex – in the last 12 months – or you didn’t.

The results of the sex test differed by gender. Men who were sexually active had higher scores on both number sequencing and word recall tests. Sexually active women had higher scores on word recall but not number sequencing. Though the differences were statistically significant, the “…magnitude of the differences in scores was small, although this is in line with general findings in the literature.”

As with the chocolate study, the sex study establishes an association but not a cause-and-effect relationship. The researchers, led by Hayley Wright, note that the association between sex and improved cognition holds, even “… after adjusting for confounding variables such as quality of life, loneliness, depression, and physical activity.”

So the association is real but we haven’t established what causes what. Perhaps sexual activity in older people improves cognition. Or maybe older people with good cognition are more inclined to have sex. Indeed, two other research papers cited by Wright et al. studied attitudes toward sex among older people in Padua, Italy, and seemed to suggest that good cognition increased sexual interest rather than vice versa. (Click here and here). Still, Wright and her colleagues might use a statistical tool from the chocolate study. If cognition leads to sex (as opposed to the other way round), people having more sex today should have had higher cognition scores in earlier waves of the longitudinal study than did people who aren’t having as much sex today.

So, we need more research. I’m especially interested in establishing whether there are any interactive effects. Let’s assume for a moment that sexual activity improves cognition. Let’s assume the same for chocolate consumption. Does that imply that combining sex and chocolate leads to even better cognition? Could this be a situation in which 1 + 1 = 3? Raise your hand if you’d like to volunteer for the study.

Chocolate Brain

Close readers of this website will remember that my sister, Shelley, is addicted to chocolate. Perhaps it’s because of the bacteria in her microbiome. Perhaps it’s due to some weakness in her personality. Perhaps it’s not her fault; perhaps it is her fault. Mostly, I’ve written about the origins of her addiction. How did she come to be this way? (It’s a question that weighs heavily on a younger brother).

There’s another dimension that I’d like to focus on today: the outcome of her addiction. What are the results of being addicted to chocolate? As it happens, my sister is very smart. She’s also very focused and task oriented. She earned her Ph.D. in entomology when she was 25 and pregnant with her second child. Could chocolate be the cause?

I thought about this the other day when I browsed through the May issue of Appetite, a scientific journal reporting on the relationship between food and health. The title of the article pretty much tells the story: “Chocolate intake is associated with better cognitive function: The Maine-Syracuse Longitudinal Study”.

The Maine-Syracuse Longitudinal Study (MSLS) started in 1974 with more than 1,000 participants. Initially, the participants all resided near Syracuse, New York. The study tracks participants over time, taking detailed measurements of cardiovascular and cognitive health in “waves” usually at five-year intervals.

The initial waves of the study had little to do with diet and nothing to do with chocolate. In the sixth wave, researchers led by Georgina Crichton decided to look more closely at dietary variables. The researchers focused on chocolate because it’s rich in flavonoids and “The ability of flavonoid-rich foods to improve cognitive function has been demonstrated in both epidemiological studies … and clinical trials.” But the research record is mixed. As the authors point out, studies of “chronic” use of chocolate “…have failed to find any positive effects on cognition.”

So, does chocolate have long-term positive effects on cognition? The researchers gathered data on MSLS participants, aged 23 to 98. The selection process removed participants who suffered from dementia or had had severe strokes. The result was 968 participants who could be considered cognitively normal.

Using a questionnaire, the researchers asked participants about their dietary habits, including foods ranging from fish to vegetables to dairy to chocolate. The questionnaire didn’t measure the quantity of food that participants consumed. Rather, it measured how often the participant ate the food – measured as the number of times per week. The researchers used a variety of tests to measure cognitive function.

And the results? Here’s the summary:

  • Women ate chocolate more frequently than men.
  • Those who ate chocolate consumed more calories overall “…but significantly less alcohol.”
  • “All cognitive scores were significantly higher in those who consumed chocolate at least once per week, than in those who never/rarely consumed chocolate.”
  • “Chocolate intake was significantly and positively associated with…” six different measures of cognitive function.

Seems pretty clear, eh? But this isn’t an experiment, so it’s difficult to say that chocolate caused the improved function. It could be that participants with better cognition simply chose to eat more chocolate. (Seems reasonable, doesn’t it?).

So the researchers delved a little deeper. They studied the cognitive assessments of participants who had taken part in earlier waves of the study. If cognition caused chocolate consumption (rather than the other way round), then people who eat more chocolate today should have had better cognitive scores in earlier waves of the study. That was not the case. This doesn’t necessarily prove that chocolate consumption causes better cognition. But we can probably reject the hypothesis that smarter people choose to eat more chocolate.
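
In sketch form, that reverse-causality check looks something like this (toy data and simple means for illustration only; the actual analysis used the full MSLS dataset and proper statistical tests):

```python
# Toy records: (eats_chocolate_now, cognition_score_from_an_earlier_wave).
participants = [
    (True, 101), (True, 98), (True, 99), (True, 102),
    (False, 100), (False, 103), (False, 97), (False, 101),
]

eaters = [score for eats, score in participants if eats]
abstainers = [score for eats, score in participants if not eats]

gap = sum(eaters) / len(eaters) - sum(abstainers) / len(abstainers)
print(f"Earlier-wave cognition gap (eaters minus abstainers): {gap:+.2f}")
# If good cognition drove chocolate eating, today's eaters should show a
# clearly positive gap in earlier waves. Crichton's team found none.
```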

So what does this say about my sister? She’s still a pretty smart cookie. But she might be even smarter if she ate more chocolate. That’s a scary thought.
