
Critical Thinking


Delayed Intuition – How To Hire Better

On a scale of 1 – 5, is he technically proficient?

Daniel Kahneman is rightly respected for discovering and documenting any number of irrational human behaviors. Prospect theory – developed by Kahneman and his colleague, Amos Tversky – has led to profound new insights into how we think, behave, and spend our money. Indeed, there’s a straight line from Kahneman and Tversky to the new discipline called Behavioral Economics.

In my humble opinion, however, one of Kahneman’s innovations has been overlooked. The innovation doesn’t have an agreed-upon name so I’m proposing that we call it the Kahneman Interview Technique or KIT.

The idea behind KIT is fairly simple. We all know about the confirmation bias – the tendency to attend to information that confirms what we already believe and to ignore information that doesn’t. Kahneman’s insight is that confirmation bias distorts job interviews.

Here’s how it works. When we meet a candidate for a job, we immediately form an impression. The distortion occurs because this first impression colors the rest of the interview. Our intuition might tell us, for instance, that the candidate is action-oriented. For the rest of the interview, we attend to clues that confirm this intuition and ignore those that don’t. Ultimately, we base our evaluation on our initial impressions and intuition, which may be sketchy at best. The result – as Google found – is that there is no relationship between an interviewer’s evaluation and a candidate’s actual performance.

To remove the distortion of our confirmation bias, KIT asks us to delay our intuition. How can we delay intuition? By focusing first on facts and figures. For any job, there are prerequisites for success that we can measure by asking factual questions. For instance, a salesperson might need to be: 1) well spoken; 2) observant; 3) technically proficient; and so on. An executive might need to be: 1) a critical thinker; 2) a good strategist; 3) a good talent finder, etc.

Before the interview, we prepare factual questions that probe these prerequisites. We begin the interview with facts and develop a score for each prerequisite – typically on a simple scale like 1 – 5. The idea is not to record what the interviewer thinks but rather to record what the candidate has actually done. This portion of the interview is based on facts, not perceptions.

Once we have a score for each dimension, we can take the interview in more qualitative directions. We can ask broader questions about the candidate’s worldview and philosophy. We can invite our intuition to enter the process. At the end of the process, Kahneman suggests that the interviewer close her eyes, reflect for a moment, and answer the question, How well would this candidate do in this particular job?
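To make the mechanics concrete, here’s a rough sketch (in Python) of what a KIT-style scorecard might look like. The prerequisite names, the 1 – 5 scale, and the rule that the global judgment can’t be recorded until every factual score is in are my own illustrative assumptions, not Kahneman’s prescription.

```python
from dataclasses import dataclass, field
from statistics import mean
from typing import Dict, Optional

# Illustrative prerequisites for a sales role; any job would define its own list.
PREREQUISITES = ["well spoken", "observant", "technically proficient"]


@dataclass
class KitScorecard:
    candidate: str
    # Factual 1-5 scores, recorded before intuition is allowed in.
    factual_scores: Dict[str, int] = field(default_factory=dict)
    # The delayed, eyes-closed global judgment, made only at the end.
    global_evaluation: Optional[int] = None

    def record_fact(self, prerequisite: str, score: int) -> None:
        """Score one prerequisite based on what the candidate has actually done."""
        if prerequisite not in PREREQUISITES:
            raise ValueError(f"Unknown prerequisite: {prerequisite}")
        if not 1 <= score <= 5:
            raise ValueError("Scores use a 1-5 scale")
        self.factual_scores[prerequisite] = score

    def record_global(self, score: int) -> None:
        """Record the intuitive judgment -- but only after the factual portion is done."""
        if set(self.factual_scores) != set(PREREQUISITES):
            raise RuntimeError("Finish the factual questions before judging overall fit")
        self.global_evaluation = score

    def factual_average(self) -> float:
        return mean(self.factual_scores.values())


# Usage: facts first, delayed intuition last.
card = KitScorecard("Candidate A")
card.record_fact("well spoken", 4)
card.record_fact("observant", 3)
card.record_fact("technically proficient", 5)
card.record_global(4)
print(card.factual_average(), card.global_evaluation)  # 4.0 4
```

The point of the structure is simply to enforce the order of operations: the factual scores exist on their own, and the intuitive judgment arrives only after they’re complete.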

Kahneman and other researchers have found that the factual scores are much better predictors of success than traditional interviews. Interestingly, the concluding global evaluation is also a strong predictor, especially when compared with “first impression” predictions. In other words, delayed intuition is better at predicting job success than immediate intuition. It’s a good idea to keep in mind the next time you hire someone.

I first learned about the Kahneman Interview Technique several years ago when I read Kahneman’s book, Thinking, Fast and Slow. But the book is filled with so many good ideas that I forgot about the interviews. I was reminded of them recently when I listened to the 100th episode of the podcast, Hidden Brain, which features an interview with Kahneman. This article draws on both sources.

Why Study Critical Thinking?

Friend or foe?

People often ask me why they should take a class in critical thinking. Their typical refrain is, “I already know how to think.” I find that the best answer is a story about the mistakes we often make.

So I offer up the following example, drawn from recent news, about very smart people who missed a critical clue because they were not thinking critically.

The story is about the conventional wisdom surrounding Alzheimer’s. We’ve known for years that people who have Alzheimer’s also have higher than normal deposits of beta amyloid plaques in their brains. These plaques build up over time and interfere with memory and cognitive processes.

The conventional wisdom holds that beta amyloid plaques are an aberration. The brain has essentially gone haywire and starts to attack itself. It’s a mistake. A key research question has been: how do we prevent this mistake from happening? It’s a difficult question to answer because we have no idea what triggered the mistake.

But recent research, led by Rudolph Tanzi and Robert Moir, considers the opposite question. What if the buildup of beta amyloid plaques is not a mistake? What if it serves some useful purpose? (Click here and here for background articles).

Pursuing this line of reasoning, Tanzi and Moir discovered that beta amyloid is actually an antimicrobial substance. It has a beneficial purpose: to attack bacteria and viruses and smother them. It’s not a mistake; it’s a defense mechanism.

Other Alzheimer’s researchers have described themselves as “gobsmacked” and “surprised” by the discovery. One said, “I never thought about it as a possibility.”

A student of critical thinking might ask, Why didn’t they think about this sooner? A key tenet of critical thinking is that one should always ask the opposite question. If conventional wisdom holds that X is true, a critical thinker would automatically ask, Is it possible that the opposite of X is true in some way?

Asking the opposite question is a simple way to identify, clarify, and check our assumptions. When the conventional wisdom is correct, it leads to a dead end. But, occasionally, asking the opposite question can lead to a Nobel Prize. Consider the case of Barry Marshall.

A doctor in Perth, Australia, Marshall was concerned about his patients’ stomach ulcers. Conventional wisdom held that bacteria couldn’t possibly live in the gastric juices of the human gut. So bacteria couldn’t possibly cause ulcers. More likely, stress and anxiety were the culprits. But Marshall asked the opposite question and discovered the bacteria now known as H. pylori. Stress doesn’t cause ulcers; bacteria do. For asking the opposite question — and answering it — Marshall won the Nobel Prize in Medicine in 2005.

The discipline of critical thinking gives us a structure and method – almost a checklist – for how to think through complex problems. We should always ask the opposite question. We should be aware of common fallacies and cognitive biases. We should understand the basics of logic and argumentation. We should ask simple, blunt questions. We should check our egos at the door. If we do all this – and more – we tilt the odds in our favor. We prepare our minds systematically and open them to new possibilities – perhaps even the possibility of curing Alzheimer’s. That’s a good reason to study critical thinking.

RAND’s Truth Decay

Truth Decay in action.

A few days ago, the RAND Corporation — one of America’s oldest think tanks — published a report titled, Truth Decay: A Threat To Policymaking and Democracy. Though I’ve read it only once and probably don’t yet grasp all its nuances, I think it’s very relevant to our world today and want to bring it to your attention.

You can find the full report here. Here are some of the highlights. The items in italics are direct quotes. The items not in italics are my comments and opinions.

What Is Truth Decay?

Heightened disagreement about facts and analytical interpretations of data — we used to disagree about opinions. Now we increasingly disagree about facts.

The blurred line between opinion and fact — we used to separate “news” from “editorial”. It was easy to tell which was which. Today, the two are commonly mixed together.

Increased volume and influence of opinion and personal experience across the communication landscape — our channels of mass communication used to be dominated by factual reporting with some clearly labeled opinion pieces mixed in. Today the reverse seems to be true.

Diminished trust in formerly respected institutions as sources of factual information — we used to have Walter Cronkite. Now we don’t.

Why Has The Truth Decayed?

Characteristics of human information processing, such as cognitive biases — these are the same biases that we’ve been studying this quarter.

Changes in the information system, such as the rise of 24-hour news coverage, social media, and dissemination of disinformation and misleading or biased information — we used to sip information slowly. Now we’re swamped by it.

Competing demands on the educational system that challenge its ability to keep pace with information system changes — is our educational system teaching people what to think or how to think?

Polarization in politics, society, and the economy — we’ve sorted ourselves out so that we only have to meet and interact with people — and ideas — that we agree with.

It’s a bracing read and I recommend it to you.

Egotism and The Awe Drought


When did you last get goose bumps as you contemplated something magnificent? When did you last feel like a small thread in an eternal fabric? When were you last awestruck?

I ask my students these questions and most everyone can remember feeling awestruck. My students get a bit dreamy when they describe the event: the vastness of a starry night or the power of a great thunderstorm. It makes them feel small. It fills them with wonder. They’re awestruck.

But not recently. The events they describe took place long ago. My students (who are mainly in their mid-30s) can reach back years to recall an event. But I can’t think of a single example of a student who was awestruck just last week. It was always the distant past.

I’m starting to believe that we’re in an awe drought. Though we say “awesome” frequently, we don’t experience true skin-tingling awe very often. Perhaps we’ve explained the world too thoroughly. There aren’t many mysteries left. Or perhaps we’re just too busy. We don’t spend much time contemplating the infinite. We’d rather do e-mail.

My subjective experience has some academic backing as well. Paul Piff and Dacher Keltner make the case that “our culture today is awe-deprived.” (Click here). They also point out that people who experience awe are more generous to strangers and more willing to sacrifice for others. An awe drought has consequences.

An awe drought might also explain the growing egotism in today’s world. Awe is the natural enemy of egotism. When you’re awestruck, you don’t feel like the center of the universe. Quite the opposite – you feel like a tiny speck of dust in a vast enterprise.

Awe holds egotism in check. If awe is declining, then egotism should be booming. And indeed, it is. A number of academic studies that trace everything from song lyrics to State of the Union addresses suggest that egotism is growing – at least in America and probably elsewhere as well. (Click here, here, here, and here for examples).

What causes what? Does a lack of awe spur greater egotism? Or does growing egotism stifle awe? Or is there some third variable in play? It’s hard to sort out and the answer may not be clear-cut one way or the other. As a practical matter, however, awe is easier to experiment with than is egotism. It’s hard to imagine that we could just tell people to stop being egotistic and get any meaningful results. On the other hand, a campaign to stimulate awe-inspiring experiences might just work. If we can put a dent in the awe drought, we might be able to sort out the impact on egotism.

So, let’s seek out awe-inspiring experiences and let’s encourage our friends to do the same. Let’s see what happens. I know that I, for one, would love to say “awesome” and actually mean it.

Digital Taylorism and Dumb Humans

I’m your new manager.

Years ago, I heard Jaron Lanier give a lecture that included a brief summary of the Turing Test. Lanier suggested that there are two ways that machines might pass Turing’s test of artificial intelligence. On the one hand, machines could get smarter. On the other hand, humans could get dumber.

I wonder if humans-getting-dumber is where we’re headed with digital Taylorism.

Frederick Taylor, who died just over 100 years ago, was the father of scientific management or what we would now call industrial engineering. Working in various machine shops in Philadelphia in the late 19th century, Taylor studied the problems of both human and machine productivity. In Peter Drucker’s words, Taylor “was the first man in recorded history who deemed work deserving of systematic observation and study.” His followers included both Henry Ford and Vladimir Lenin.

The promise of the original Taylorism was increased productivity and lower unit costs. The gains resulted from fundamental changes in human work habits. Taylor-trained managers, for instance, broke complex tasks into much simpler sub-tasks that could more easily be taught, measured, and monitored. As a result, productivity rose dramatically but work was also dehumanized.

According to numerous commentators, we are today seeing a resurgence of Taylorism in the digital workplace. With digital tools and the Internet of Things, we can more carefully and closely monitor individual workers. In some cases, we no longer need humans to manage humans. Machines can apply scientific management to workers better than humans can. (Click here and here for more detail).

Digital Taylorism has spawned an array of devices that measure work in ever-more-granular detail. Sociometric badges are “…wearable sensing devices designed to collect data on face-to-face communication and interaction in real time.” They could deliver “…a dramatic improvement in our understanding of human behavior at unprecedented levels of granularity.”

More recently, Amazon patented a wristband that can monitor a warehouse worker’s every movement. The wristband can track where a worker’s hands are in relation to warehouse bins to monitor productivity. It can also use haptic feedback – basically buzzes and vibrations – to alert workers when they make a mistake. (Click here, here, and here for more detail).
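Neither the badge makers nor Amazon’s patent spells out the underlying logic, but the basic monitoring loop is easy to imagine. Here’s a purely hypothetical sketch: it assumes the wristband reports a hand position, the system compares that position to known bin locations, and a buzz fires when the hand strays toward the wrong bin. The coordinates, threshold, and function names are all my inventions for illustration.

```python
import math
from dataclasses import dataclass


@dataclass
class Point:
    x: float
    y: float
    z: float


def distance(a: Point, b: Point) -> float:
    return math.dist((a.x, a.y, a.z), (b.x, b.y, b.z))


# Hypothetical bin coordinates in the warehouse (meters).
BINS = {
    "A1": Point(0.0, 0.0, 1.0),
    "A2": Point(0.5, 0.0, 1.0),
}

# Assumed threshold: react when the tracked hand comes within 30 cm of a bin.
REACH_THRESHOLD = 0.3


def check_pick(hand: Point, target_bin: str) -> str:
    """Return a feedback signal for one tracked hand position (hypothetical logic)."""
    for name, location in BINS.items():
        if name != target_bin and distance(hand, location) < REACH_THRESHOLD:
            return "buzz"      # haptic alert: the hand is reaching toward the wrong bin
    if distance(hand, BINS[target_bin]) < REACH_THRESHOLD:
        return "ok"            # pick registered at the correct bin
    return "tracking"          # still moving; keep monitoring


# A worker told to pick from bin A1 drifts toward A2 instead.
print(check_pick(Point(0.45, 0.0, 1.0), target_bin="A1"))  # -> "buzz"
```

Simple as it is, a loop like this runs continuously, which is exactly what makes the monitoring so granular.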

Could digital Taylorism fulfill Lanier’s suggestion that machines will match human intelligence not because they get smarter but because humans get dumber? Could it make humans dumber?

It’s hard to say, but there is some evidence that we did indeed get dumber the last time we fundamentally altered our work habits. Roughly 10,000 years ago, human brains began shrinking. Prior to that time, the average human brain was roughly 1,500 cubic centimeters. Since then, our brains have shrunk to about 1,350 cubic centimeters. As one observer points out, the amount of brain matter we’ve lost is roughly the size of a tennis ball.
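The tennis ball comparison holds up to some quick arithmetic. Assuming a regulation ball of roughly 6.7 centimeters in diameter (my figure, not the observer’s), the volumes are in the same range:

```python
import math

brain_then = 1500   # average human brain ~10,000 years ago, cubic centimeters
brain_now = 1350    # average human brain today, cubic centimeters
brain_loss = brain_then - brain_now                             # 150 cc

ball_diameter_cm = 6.7                                          # assumed regulation tennis ball
ball_volume = (4 / 3) * math.pi * (ball_diameter_cm / 2) ** 3   # ~157 cc

print(f"Brain matter lost: {brain_loss} cc")
print(f"Tennis ball volume: {ball_volume:.0f} cc")
```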

What happened? A leading hypothesis suggests that our brains began shrinking when we transitioned from hunter-gatherer societies to agricultural societies. Hunter-gatherers live by their wits and need big brains. Farmers don’t. As our work changed, so did our brains.

Could digital Taylorism lead to a new wave of brain shrinkage? It’s possible. In a previous article, I asked, What should we do when robots replace us? Perhaps a better question is, What should we do when robots manage us?
