When I was in graduate school, I got a heavy dose of systems thinking. The basic idea is to take a problem, break it apart, and build it up. Let’s say I’m building a house. The house clearly is a system unto itself but we can also break it into subsystems — like plumbing. Plumbing is a logically coherent system with specified inputs and outputs. We can further deconstruct plumbing into more specific subsystems, like sewage versus potable water. As we deconstruct systems into subsystems, we look for linkages. How does one subsystem contribute to another? How do they build on each other?
We can also build upward into larger systems. The house, for instance, is part of a neighborhood which, in turn, is part of a city. The neighborhood also has wastewater systems and electrical systems that the house needs to connect to. If I want to get my mail delivered, it also needs an address — part of a much larger system of geographic designators.
It turns out that systems thinking is a pretty good way to build computer programs. A subroutine that calculates your sales tax, for instance, has specified inputs and outputs. It’s not logically different from a plumbing system. In either case, we start with a problem, break it down, build it up, and find a solution that fits with other systems. Note that we start with a problem and end with a solution.
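The plumbing analogy can be made concrete. Here's a minimal sketch of such a subroutine — the rate table and numbers are illustrative, not real tax rules:

```python
# A self-contained subroutine with specified inputs and outputs,
# analogous to a plumbing subsystem: a purchase amount flows in,
# a tax amount flows out. Rates below are made up for illustration.
TAX_RATES = {"CO": 0.029, "CA": 0.0725, "OR": 0.0}

def sales_tax(subtotal: float, state: str) -> float:
    """Input: purchase subtotal and state code. Output: tax owed."""
    rate = TAX_RATES.get(state)
    if rate is None:
        raise ValueError(f"No rate on file for {state}")
    return round(subtotal * rate, 2)

print(sales_tax(100.00, "CO"))  # 2.9
```

The point isn't the arithmetic; it's that the subroutine, like the plumbing, is a logically coherent subsystem that other systems can connect to without knowing its internals.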
When Elliot went to architecture school, he got a heavy dose of design thinking. He’s now light years ahead of me. (Isn’t it great when your kid can teach you stuff?) I still find design thinking challenging. I think that’s because I was so heavily invested in systems thinking. Frankly, I didn’t realize how much systems thinking influenced my perspective. It’s like culture. You don’t recognize the deep influence of your own culture until you visit another culture and make comparisons. As I learned design thinking, I realized that there is a whole different way of seeing the world.
The trick with design thinking is that you begin with the solution and work your way backward to the problem. What a concept! Here’s what Wikipedia says:
“…the design way of problem solving starts with the solution in order to start to define enough of the parameters to optimize the path to the goal. The solution, then, is actually the starting point.”
And here’s what John Christopher Jones says in his classic book, Designing Designing:
“The main point of difference is that of timing. Both artists and scientists operate on the physical world as it exists in the present …Designers, on the other hand, are forever bound to treat as real that which exists only in an imagined future and have to specify ways in which the foreseen thing can be made to exist.”
Why would a business person be interested in design thinking? After all, most B-schools (and computer science programs) teach systems thinking. Unless you’re an architect, isn’t that enough? Well… I’ve noticed that a lot of leading business thinkers now include designers on their teams. In yesterday’s post, I mentioned that A.G. Lafley of Procter & Gamble had designers (from IDEO) in his coterie of advisors. Similarly, was Steve Jobs more of a business genius or a design genius? Design thinkers give us a different way of looking at the world. Maybe we should take them more seriously in business.
As I’m teaching a course on critical thinking, I thought it would be useful to study the history of the concept. What have leading thinkers of the past conceived to be “critical thinking” and how have their perceptions changed over time?
One of the earliest — and most interesting — references that I’ve found is a sermon called the Kalama Sutta preached by the Buddha some five centuries before Christ. Often known as the “charter of free inquiry”, it lays out general tenets for discerning what is true.
Many religions hold that truth is revealed through scriptures or through institutions that are authorized to interpret scriptures. By contrast, Buddhism generally asserts that we have to ascertain truth for ourselves. So, how do we do that?
That was essentially the question that the Kalama people asked the Buddha when he passed through their village of Kesaputta. The Buddha’s sermon emphasizes the need to question statements asserted to be true. Further, the Buddha goes on to list multiple sources of error — including oral tradition, hearsay, scripture, conjecture, and deference to one’s own teacher — and cautions us to carefully examine assertions from those sources.
Further, the sutta advises, “Do not accept any doctrine from reverence, but first try it as gold is tried by fire.” This requires examination, reflection, and questioning, and only that which is “conducive to the good” should be accepted as truth.
As Thanissaro Bhikkhu summarizes it, “any view or belief must be tested by the results it yields when put into practice; and — to guard against the possibility of any bias or limitations in one’s understanding of those results — they must further be checked against the experience of people who are wise.”
So how do the Buddhist commentaries compare to other philosophers? In the century after Buddha, Socrates is quoted as saying, “I know you won’t believe me, but the highest form of Human Excellence is to question oneself and others.” Almost 2,000 years later, Francis Bacon wrote, “Critical thinking is a desire to seek, patience to doubt, fondness to meditate, slowness to assert, readiness to consider, carefulness to dispose and set in order; and hatred for every kind of imposture.” A few hundred years later, Descartes wrote, “If you would be a real seeker after truth, it is necessary that at least once in your life you doubt, as far as possible, all things.” A hundred years after that, Voltaire wrote about the consequences of a failure of critical thinking, “Anyone who has the power to make you believe absurdities has the power to make you commit injustices.”
As the Swedes would say, there seems to be a “bright red thread” that ties all of these together. Go slowly. Ask questions. Be patient. Doubt your sources. Consider your own experience. Judge the evidence thoughtfully. For well over 2,000 years our philosophers — both Eastern and Western — have been saying essentially the same thing. It seems that we know what to do. Now all we have to do is to do it.
In his book, Thinking Fast and Slow, Daniel Kahneman has an interesting example of a heuristic bias. Read the description, then answer the question.
Steve is very shy and withdrawn, invariably helpful but with little interest in people or in the world of reality. A meek and tidy soul, he has a need for order and structure, and a passion for detail.
Is Steve more likely to be a librarian or a farmer?
I used this example in my critical thinking class the other night. About two-thirds of the students guessed that Steve is a librarian; one-third said he’s a farmer. As we debated Steve’s profession, the class focused exclusively on the information in the simple description.
Kahneman’s example illustrates two problems with the rules of thumb (heuristics) that are often associated with our System 1 thinking. The first is simply stereotyping. The description fits our widely held stereotype of male librarians. It’s easy to conclude that Steve fits the stereotype. Therefore, he must be a librarian.
The second problem is more subtle — what evidence do we use to draw a conclusion? In the class, no one asked for additional information. (This is partially because I encouraged them to reach a decision quickly. They did what their teacher asked them to do. Not always a good idea.) Rather, they used the information that was available. This is often known as the availability bias — we make a decision based on the information that’s readily available to us. As it happens, male farmers in the United States outnumber male librarians by a ratio of about 20 to 1. If my students had asked about this, they might have concluded that Steve is probably a farmer — statistically at least.
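The base-rate arithmetic is easy to check. Here's a quick sketch in the odds form of Bayes’ rule, using the 20-to-1 farmer-to-librarian ratio from above; the two “fit” probabilities are illustrative assumptions, not measured values:

```python
# Odds form of Bayes' rule: posterior odds = prior odds / likelihood ratio
# (when the likelihood ratio favors the other hypothesis).
# The 20:1 base rate comes from the post; the probabilities that the
# description fits each group are assumed for illustration.
prior_odds_farmer = 20 / 1           # male farmers vs. male librarians

p_desc_given_librarian = 0.40        # assumed: fits many librarians
p_desc_given_farmer = 0.10           # assumed: fits far fewer farmers

likelihood_ratio = p_desc_given_librarian / p_desc_given_farmer  # 4.0
posterior_odds_farmer = prior_odds_farmer / likelihood_ratio     # 5.0

print(posterior_odds_farmer)  # 5.0
```

Even granting the stereotype a 4-to-1 edge, the 20-to-1 base rate leaves Steve about five times more likely to be a farmer. That’s the information no one in the room thought to ask for.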
The availability bias can get you into big trouble in business. To illustrate, I’ll draw on an example (somewhat paraphrased) from Paul Nutt’s book, Why Decisions Fail.
Peca Products is locked in a fierce competitive battle with its archrival, Frangro Enterprises. Peca has lost 4% market share over the past three quarters. Frangro has added 4% in the same period. A board member at Peca — a seasoned and respected business veteran — grows alarmed and concludes that Peca has a quality problem. She sends memos to the executive team saying, “We have to solve our quality problem and we have to do it now!” The executive team starts chasing down the quality issues.
The Peca Products executive team is falling into the availability trap. Because someone who is known to be smart and savvy and experienced says the company has a quality problem, the executives believe that the company has a quality problem. But what if it’s a customer service problem? Or a logistics problem? Peca’s executives may well be solving exactly the wrong problem. No one stopped to ask for additional information. Rather, they relied on the available information. After all, it came from a trusted source.
So, what to do? The first thing to remember in making any significant decision is to ask questions. It’s not enough to ask questions about the information you have. You also need to seek out additional information. Questioning also allows you to challenge a superior in a politically acceptable manner. Rather than saying “you’re wrong!” (and maybe getting fired), you can ask, “Why do you think that? What leads you to believe that we have a quality problem?” Proverbs says that “a gentle answer turneth away wrath”. So does an insightful question.
When I’m happy, I smile. A few years ago, I discovered that the reverse is also true: when I smile, I get happy. I’ve also found that standing up straight — as opposed to slouching — can improve my mood and perhaps even my productivity.
It turns out that I was on to something much bigger than I realized at the time. There’s increasing evidence that we think with our bodies as well as our minds. Some of the evidence is simply the way we express our thoughts through physical metaphors. For instance, “I’m in over my head”, “I’m up to my neck in trouble”, “He’s my right hand man”, and so on. Because we use bodily metaphors to express our mental condition, the field of “body thinking” is often referred to as metaphor theory. More generally, it’s called embodied cognition.
In experiments reported in New Scientist and in Scientific American, we can see some interesting effects. For instance, with people from western cultures, “up” and “right” mean bigger or better while “down” or “left” mean smaller or worse. So, for instance, when volunteers were asked to call out random numbers to the beat of a metronome, their eyes moved upward when the next number was larger and moved downward when the next number was smaller. Similarly, volunteers were asked to gauge the number of objects in a container. When they were asked to lean to the left while formulating their estimate, they guessed smaller numbers on average. When leaning to the right, they guessed larger numbers.
In another experiment, volunteers were asked to move boxes either: 1) from a lower shelf to a higher shelf; or 2) from a higher shelf to a lower shelf. While moving the boxes, they were asked a simple, open-ended question, like: What were you doing yesterday? Those who were moving the boxes upward were more likely to report positive experiences. Those who were moving the boxes downward were more likely to report negative experiences.
When I speak of the future, I often gesture forward — the future is ahead of us. When I speak of the past, I often do the opposite, gesturing back over my shoulder. Interestingly, the Aymara Indians of the Bolivian highlands are reported to do the opposite — pointing backward to the future and forward to the past. That seems counter-intuitive to me but the Aymara explain that the future is invisible. You can’t see it. It’s as if it’s behind you where it can quite literally sneak up on you. The past, on the other hand, is entirely visible — it’s spread out in front of you. That makes a lot of sense but my cultural training is so deeply embedded that I find it very hard to point backward to the future. It’s an interesting example of cultural differences influencing embodied cognition.
Where does this leave us? Well, it’s certainly a step away from Descartes’ formulation of the mind-body duality. The body is not simply a sensing instrument that feeds data to the mind. It also feeds thoughts and influences our moods in subtle ways. Yet another reason to take good care of your body.
Heuristics are simply rules of thumb. They help us make decisions quickly and, in most cases, accurately. They help guide us through the day. Most often, we’re not even aware that we’re making decisions. Unfortunately, some of those decisions can go haywire — precisely because we’re operating on automatic pilot. In fact, psychologists suggest that we commonly make 17 errors via heuristics. I’ve surveyed 11 of them in previous posts (click here, here, and here). Let’s look at the last six today.
Optimistic bias — can get us into a lot of trouble. It leads us to conclude that we have much more control over dangerous situations than we actually do. We underestimate our risks. This can help us when we’re starting a new company; if we knew the real odds, we might never try. It can hurt us, however, when we’re trying to estimate the danger of cliff diving.
Hindsight bias — can also get us into trouble. Everything we did that was successful is by dint of our own hard work, talent, and skill. Everything we did that was unsuccessful was because of bad luck or someone else’s failures. Overconfidence, anyone?
Elimination by aspect — you’re considering multiple options and drop one of them because of a single issue. Often called the “one-strike-and-you’re-out” heuristic. I think of it as the opposite of satisficing. With satisficing, you jump on the first solution that comes along. With elimination by aspect, you drop an option at the first sign of a problem. Either way, you’re making decisions too quickly.
Anchoring with adjustment — I call this the “first-impressions-never-die” heuristic and I worry about it when I’m grading papers. Let’s say I give Harry a C on his first paper. That becomes an anchor point. When his second paper arrives, I run the risk of simply adjusting upward or downward from the anchor point. If the second paper is outstanding, I might just conclude that it’s a fluke. But it’s equally logical to assume that the first paper was a fluke while the second paper is more typical of Harry’s work. Time to weigh anchor.
Stereotyping — we all know this one: to judge an entire group based on a single instance. I got in an accident and a student from the University of Denver went out of her way to help me out. Therefore, all University of Denver students must be altruistic and helpful. Or the flip side: I had a bad experience with Delta Airlines, therefore, I’ll never fly Delta again. It seems to me that we make more negative stereotyping mistakes than positive ones.
All or Nothing — the risk of something happening is fairly low. In fact, it’s so low that we don’t have to account for it in our planning. It’s probably not going to happen. We don’t need to prepare for that eventuality. When I cross the street, there’s a very low probability that I’ll get hit by a car. So why bother to look both ways?
As with my other posts on heuristics and systematic biases, I relied heavily on Peter Facione’s book, Think Critically, to prepare this post. You can find it here.