In yesterday’s post, I discussed the difference between uncertainty and risk and how the distinction can give us better analytic tools. Today, let’s focus on what to do when uncertainty is certain and you just don’t know how “the world will behave tomorrow.” Here are some tips:
Revisit your assumptions – you assume things because you want to simplify the world and convert uncertainty into risk. If you assume that there are three possible outcomes, for instance, you can start assigning probabilities to each and treat them as risks. But what if there’s a fourth outcome that you assumed just couldn’t happen? That’s where the problem lies. It helps to list your assumptions and share them with others. Do they agree that your assumptions cover all the bases?
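To make this tip concrete, here's a toy sketch in Python. The outcomes, probabilities, and payoffs are all invented for illustration; the point is what we're really doing when we convert uncertainty into risk, and why an outcome missing from the list is invisible to the arithmetic:

```python
# Toy model: we "convert uncertainty into risk" by listing outcomes
# and assigning each a probability. All numbers are invented.

assumed_outcomes = {
    "demand_rises": {"probability": 0.5, "payoff": 100},
    "demand_flat":  {"probability": 0.3, "payoff": 20},
    "demand_falls": {"probability": 0.2, "payoff": -50},
}

# Once the outcomes are enumerated, risk arithmetic is easy...
total_p = sum(o["probability"] for o in assumed_outcomes.values())
assert abs(total_p - 1.0) < 1e-9, "probabilities must sum to 1"

expected_payoff = sum(
    o["probability"] * o["payoff"] for o in assumed_outcomes.values()
)
print(f"Expected payoff under our assumptions: {expected_payoff:.1f}")

# ...but the fourth outcome we assumed "just couldn't happen" never
# appears in the dictionary, so no calculation will ever flag it.
# The fix is upstream of the math: list your assumptions and ask
# others whether the outcome list really covers all the bases.
```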
Study history – a few days ago, I read an article about the coal industry in the United States. In 2008 and 2009, many American coal companies assumed that China’s appetite for coal was insatiable. These companies invested heavily in coal production and went bankrupt when their assumption proved wrong. Studying examples like this can help us understand our own assumptions and where our blind spots are.
Slow down – speed is the enemy of good decision-making. Step back, look around, consult with diverse analysts, and maybe even do a little yoga to relax. You won’t make good decisions when you’re hurried and stressed.
Recognize randomness – we love to make up stories to explain why things happened the way they did. Michael Mauboussin calls our desire to explain things an itch that must be scratched. Resist the temptation to scratch it. Recognizing randomness will help you expand your assumptions and deal more effectively with a wide range of possibilities.
Sharpen your observational skills – we jump to conclusions far too often because we don’t pay close attention to indicators and signals in the environment. Observation is a skill like any other – with deliberate practice, we can improve it.
Include observers with fluid intelligence – those of us with some years of experience on our résumés have a lot of crystallized intelligence. We think we know what’s going to happen because we’ve seen it all before, and we grow overconfident in our ability to predict the future. It’s a good idea to include people with more fluid intelligence on your team. They haven’t seen it all before and, therefore, they make fewer assumptions. They don’t know that they don’t know, which makes them more acute observers.
Invest in information – Nathan Bennett and James Lemoine summarize the situation in their article on VUCA: “Invest in information – collect, interpret, and share it. This works best in conjunction with structural changes, such as adding information analysis networks, that can reduce ongoing uncertainty.”
Over time, I expect to write about all four elements of a VUCA environment – Volatility, Uncertainty, Complexity, and Ambiguity. Let’s start with uncertainty, which seems somewhat different from the other elements. With volatility, complexity, and ambiguity, we know something. With uncertainty, we don’t know something.
Uncertainty is often compared to risk. I wondered why uncertainty was included in VUCA while risk was not. After all, the two concepts are similar. But, as I’ve learned, risk is about knowledge while uncertainty is about lack of knowledge. As Michael Mauboussin explains, with risk we don’t know what’s going to happen but we do know the distribution of possibilities. Uncertainty is similar in that we don’t know what will happen. The difference is that we don’t even know what could happen.
Mauboussin uses games of chance to illustrate his point. In roulette, for instance, we don’t know which number will turn up but we can gauge the probability of each number. The same is true for most card games, dice games, etc. With uncertainty, however, we have little or no idea what the outcomes might be. We may assume that we know the range of possible outcomes but we really don’t.
As Nate Silver points out, you can put a price on risk but not on uncertainty. We know the odds of drawing to an inside straight and we can use that knowledge to bet to our advantage. We can’t bet effectively on uncertainty because we don’t know what the outcomes are or how they are distributed. Silver concludes, “Risk greases the wheels of a free-market economy; uncertainty grinds them to a halt.”
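A quick sketch in Python makes the “price on risk” point tangible. The numbers below are the standard ones – 38 pockets on an American roulette wheel, and 4 of 47 unseen cards filling an inside straight in five-card draw:

```python
from fractions import Fraction

# American roulette: 38 pockets (1-36, 0, 00); a single-number bet
# pays 35 to 1. We don't know which number will come up, but we know
# the full distribution of possibilities, so we can price the bet.
p_win = Fraction(1, 38)
ev = p_win * 35 + (1 - p_win) * -1
print(f"Roulette single-number bet, EV per $1: {float(ev):.4f}")
# -> -0.0526: a known house edge of about 5.26%

# Drawing to an inside straight: exactly 4 of the 47 unseen cards
# fill the gap, so the probability is knowable in advance.
p_fill = Fraction(4, 47)
breakeven_odds = (1 - p_fill) / p_fill
print(f"Chance of filling the straight: {float(p_fill):.4f}")
print(f"Break-even pot odds: {float(breakeven_odds):.2f} to 1")
# We can bet to our advantage whenever the pot offers better than
# ~10.75-to-1. Under genuine uncertainty there is no such
# arithmetic: the distribution itself is unknown.
```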
Even the Federal Reserve has weighed in on this question. An article published by the Federal Reserve Bank of St. Louis explains:
“Although risk is quantifiable, uncertainty is not. Rather, uncertainty arises from imperfect knowledge about the way the world behaves. Most importantly, uncertainty relates to the questions of how to deal with the unprecedented, and whether the world will behave tomorrow in the same way as it behaved in the past.”
Understanding the difference between risk and uncertainty can help us analyze contentious issues. The Iranian nuclear treaty may provide a good example. American political leaders who support the treaty view Iran as a rational actor: the country doesn’t want to commit suicide by initiating a nuclear war. In this view, the possible outcomes are known and we can judge their probability.
Leaders who oppose the treaty see Iran as irrational – much like North Korea. They see Iran as a messianic and extremist religious state that might well be willing to risk suicide to provoke Armageddon. Outcomes are not only unpredictable; they are unknowable. No treaty could possibly account for all possible outcomes.
Placing the Iranian treaty in the risk/uncertainty framework gives us additional tools to analyze the situation and refine our strategies. It tells us what to look for. The fundamental question we’re trying to answer is simple: Is Iran a rational actor or not? I think we could develop a framework that would help us answer this question rationally rather than emotionally.
The Iranian treaty is, of course, a global issue. What if you encounter uncertainty in your personal or professional life? How can you assess the situation and improve your chances? More on that tomorrow.
When I teach critical thinking, I don’t focus much attention on the environment that we’re thinking in. We learn how to identify assumptions, assess evidence, understand our biases, and reach rational conclusions. The assumption in all this (and it’s a big one) is that these critical thinking processes will work in any environment.
But will they? What if you’re working in a VUCA environment? VUCA is a trendy acronym that originated in military planning circles. How do we teach our military leaders to make good decisions in environments that are Volatile, Uncertain, Complex, and Ambiguous? In a VUCA world, the environment in which we make decisions comes to the fore and may overshadow our thinking processes.
Indeed, in a VUCA world, one might conclude that planning, strategy, logic, and critical thinking are useless. As Nathan Bennett points out, even experienced business leaders are tempted to conclude, “Hey! It’s a crazy world out there! What’s the use of planning?”
Interpreting VUCA as one thing can indeed be overwhelming. But VUCA isn’t one thing – it’s four things. The first step in dealing with VUCA is to analyze which elements are most salient. Then we can adjust our strategy accordingly.
Let’s look at each of the four elements of VUCA:
Volatile – things are changing quickly. We need to understand the dynamics, speed, and direction of change. Just because the environment is volatile, however, doesn’t mean that it’s unpredictable. Disruptive innovations, for instance, create volatility but not uncertainty. If we understand the dynamics of disruption, we can make remarkably good predictions.
Uncertain – the environment is unpredictable; surprises happen all too often. Note that uncertainty doesn’t necessarily imply volatility. For instance, our society is certainly changing but the pace is rather slow. What’s uncertain is the direction of change.
Complex – there are a lot of moving parts and it’s not quite clear how they’re connected or how they interact with each other. It’s impossible to tell what will happen if I flip this switch or pull that lever. We regularly see this in political and economic debates. Will lowering taxes lead to greater growth and, therefore, higher tax revenues? Well … it’s complicated. Note that complexity is not the same as uncertainty or volatility.
Ambiguous – the signs are not clear and it’s easy to misinterpret what’s actually happening. We may confuse cause and effect. For instance, people who own their own homes are less likely to commit crimes. So, a government might institute a program to help people buy their own homes with a goal of reducing the crime rate. But what if we’ve confused cause with effect? What if people who don’t commit crimes are more likely to own their own homes, rather than vice versa? Cause and effect are often ambiguous, and it’s useful to study them closely – as the sketch after this list illustrates.
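Here’s a small simulation in Python that makes the home-ownership example concrete. All the numbers are invented; the only structural assumption is that a hidden third factor (stable income, say) drives both home ownership and low crime, while ownership itself has no causal effect:

```python
import random

random.seed(0)

def simulate(subsidize_ownership=False, n=100_000):
    """Toy world: stable income raises the chance of owning a home
    AND lowers the chance of committing a crime. Owning a home has
    no causal effect on crime at all."""
    owners = renters = owner_crimes = renter_crimes = 0
    for _ in range(n):
        stable_income = random.random() < 0.5
        p_own = 0.8 if stable_income else 0.2
        if subsidize_ownership:
            # The policy: help people buy homes, hoping to cut crime.
            p_own = min(1.0, p_own + 0.3)
        owns_home = random.random() < p_own
        # Crime depends only on the hidden factor, never on ownership.
        p_crime = 0.02 if stable_income else 0.10
        commits_crime = random.random() < p_crime
        if owns_home:
            owners += 1
            owner_crimes += commits_crime
        else:
            renters += 1
            renter_crimes += commits_crime
    return owner_crimes / owners, renter_crimes / renters

print("No subsidy:   owners %.3f, renters %.3f" % simulate())
print("With subsidy: owners %.3f, renters %.3f"
      % simulate(subsidize_ownership=True))
# In both runs, owners commit fewer crimes than renters -- the
# correlation is real. But the subsidy leaves every individual's
# crime probability untouched: we mistook a marker of the cause
# (stable income) for the cause itself.
```

Read the data one way and the home-buying program looks certain to cut crime; read it as confounded and the program looks pointless. That’s ambiguity at work.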
Taking VUCA as a single, integrated phenomenon can lead to a sense of futility and hopelessness. If the world is entirely random and chaotic, what can we mere mortals do? The trick is to decompose VUCA into its component parts. Analyze each component and then start plotting a strategy. (More on this in future posts).
VUCA environments call for a good dose of fluid intelligence to complement the crystallized intelligence in your organization. They also require a strong dose of critical thinking. Indeed, the more VUCA your environment, the more critical critical thinking becomes.