The Psychology of Perceiving Uncertainty


Editor’s Note: This is the fourth installment in “Off Guard,” a series on surprise in war inspired by a new CSIS study. Read the rest of the series here.

Saddam Hussein and George W. Bush saw different worlds in early 2003, but shared a common belief: Each was certain that his read of the strategic situation was accurate. Saddam was certain that U.S. intelligence was nearly omniscient, and therefore would know that he harbored no weapons of mass destruction (WMD). Bush was certain that Saddam did have them, since if he did not, Iraq would have cooperated with inspectors. With the benefit of hindsight, we know how misplaced both beliefs were. There was more uncertainty regarding WMD than each leader realized. The same was true of the outcome of the pending invasion, which Saddam did not believe would happen until the moment it did, and of the subsequent occupation.

Today, uncertainty is in vogue. Technically, uncertainty is the idea that the range of possible outcomes is beyond our comprehension. More simply, it’s the idea that we are missing something, though we’re not quite sure what. In the context of national security, uncertainty refers to the simple idea that the strategic landscape, America’s place in the world, and the technologies of war are shifting in ways that we do not fully understand. Uncertainty is not the same thing as lacking confidence, nor is it the same thing as risk. A roulette player faces risk, but not uncertainty — she does not know what will happen on the next spin, but does know how often she will hit red over the long run. In an uncertain world, we lack long-run odds. More importantly, we are unsure whether the game itself has changed.
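The roulette distinction above can be made concrete. A minimal sketch (using an American wheel's 18 red pockets out of 38, a standard figure assumed here for illustration): under risk, the long-run frequency is knowable before a single spin; under uncertainty, the distribution itself is in doubt.

```python
import random

# Risk: an American roulette wheel has 18 red pockets out of 38.
# Each spin is unpredictable, but the long-run frequency is not.
P_RED = 18 / 38  # knowable in advance, before any spins

random.seed(0)  # fixed seed so the simulation is reproducible
spins = [random.random() < P_RED for _ in range(100_000)]
observed = sum(spins) / len(spins)
print(f"known long-run odds: {P_RED:.4f}, observed: {observed:.4f}")

# Uncertainty would mean the number of red pockets is itself unknown,
# and possibly changing: no stable long-run frequency exists for the
# simulation to converge on, no matter how much past data we collect.
```

The player's ignorance about the next spin is risk; doubt about whether the wheel itself has changed is uncertainty.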

Some individuals and organizations are better at perceiving uncertainty and adjusting their actions accordingly. I argue that we can predict who does and doesn’t perceive uncertainty based on a number of individual, group-level, and organizational factors. I offer recommendations for how decision-makers at each of these three levels can better perceive uncertainty.

The advice comes with the caveat that uncertainty should be taken in small doses — after all, a crucial task for political and military leaders is to convey confidence to motivate followers. Still, leaders are more likely to perceive too little uncertainty rather than too much of it. Certainty’s pull is constant — it feels better at a fundamental level — and so our deliberate attention should go to uncertainty.

As others in this series have argued, seeing uncertainty makes us less likely to fight the wrong war with the wrong tools. If we do not understand how decision-makers’ minds process uncertainty, and what makes them better or worse at perceiving it, our best intentions may be for naught.


At the individual level, perceiving uncertainty entails two psychological steps: comfort with the unknown and habits of thought that seek out complexity. Being comfortable with the unknown leaves individuals open to the idea of seeing uncertainty. Thinking in more complex terms gives individuals the means to better understand uncertain situations.

Individuals differ in how comfortable they are with the unknown simply by virtue of their personalities. Differences in openness to experience, tolerance of ambiguity, and need for cognitive closure can all affect someone's willingness to see a situation as uncertain. These three traits tend to correlate with one another; they fluctuate early in life but stabilize in adulthood. They are among the relatively few personality variables that show little systematic difference between men and women. They do, however, tend to differ by political ideology: Controversially but consistently, conservatives show less openness to the unknown.

Importantly, however, even those who are dispositionally uncomfortable with the unknown can alleviate this discomfort by mentally reframing it. This idea, known as “cognitive reappraisal,” is a surprisingly powerful tool for dealing with the effects of emotions on decision-making. Cognitive reappraisal creates distance from the emotion and a sense of control over it. Applied to uncertainty, it can be a first step toward seeing it better.

The second step in perceiving uncertainty is to think in more complex terms. The more complexity someone sees, the more opportunities there are for uncertainty to surface. An individual’s mental model can be complex in two ways: It can be made up of many dimensions, meaning lots of different factors come into play; and it can be highly integrated, meaning each factor can affect the others. Individuals high in “integrative complexity,” as researchers label it, judge the effects of their decisions as the product of many dimensions interacting at multiple levels.

For example, an analyst with high integrative complexity evaluating whether to invade Iraq would factor in the resolve of the Iraqi military, the sentiments of the Iraqi people, American military preparedness, regional stability, global commodity prices, and other dimensions besides. Such an analyst would also see each of these dimensions as interacting with the others. Low integrative complexity, on the other hand, flattens many dimensions into one. It substitutes simple trade-offs for complex interactions: Had Saddam fully complied with weapons inspections or not?

Integrative complexity is partly heritable, but can also be learned. Perhaps most importantly, however, integrative complexity depends on how much an individual knows about the subject matter. Sunni-Shiite relations could not factor into an invasion decision unless a decision-maker first had the knowledge that the distinction is important in Iraqi society.


An individual’s integrative complexity also depends on the people around her. “Slam dunk,” one of the more infamous phrases from the run-up to the invasion of Iraq, helps us understand how group settings affect perceptions of uncertainty. As Bob Woodward described in Plan of Attack, after an intelligence briefing on Iraqi WMD activity, Bush was unimpressed. He asked George Tenet, his CIA director, “I’ve been told all this intelligence about having WMD and this is the best we’ve got?” Tenet rose and replied, “It’s a slam dunk case!” Moments later, he used the phrase again. Administration officials found this comforting, especially because “it was unusual for Tenet to be so certain.”

This story fits with a classic finding in psychology. Individuals who have to explain themselves to important audiences with known views show less integrative complexity than those who don’t know their audience’s thoughts. In other words, speakers conform to the opinions of their audience. Tenet seemed to know where the president stood on the question of invasion — and the phrasing of Bush’s reported comment to the CIA chief suggests he wanted to believe the weapons existed. Intentionally or not, the White House meeting became less about seeking uncertainty and more about aligning with the president’s opinion.

Even if the president is not in the room, groups tend to (but don’t always) reduce the amount of uncertainty that individuals perceive. Group settings allow individuals to avoid uncertainty by changing the question they ask themselves. Rather than ask what the future holds — an uncertain and difficult question — a member of a group can instead ask herself whether a fellow group member’s idea sounds reasonable, which is much easier to answer. “Groupthink” is the result.


Organizations affect the perception of uncertainty in a number of ways — for instance, through the incentives they offer and norms they embrace. One factor that is especially important for security organizations is whether they portray the strategic environment as “static or dynamic.” Security bureaucracies have to regularly assess both capabilities and intentions — are other states’ military capacities growing or shrinking? Are these states adversaries or allies? Individuals perceive more uncertainty when they work in organizations that paint the environment as dynamic.

In their superb article on “chronic misperception” before the invasion of Iraq, Charles Duelfer and Stephen Dyson attribute many of America’s mistakes in perception to the formation of an “enemy image.” Saddam was seen, simply and permanently, as a U.S. adversary. This belief saturated the security establishment and made it easy to overlook the uncertainty surrounding the signals Saddam sent with his behavior. Actions meant to deter Iran (Saddam’s primary concern) were seen instead as evidence of his hostile intent toward the United States.

Organizational leaders — presidents, cabinet secretaries, and commanders — all influence the formation of enemy images. Prior to the Iraq invasion, Bush and his deputies made it clear where they stood on the question of Saddam’s intentions, and this understanding trickled through the security bureaucracy.

At another level, strategy documents can also create enemy images and spread them organization-wide. For example, the current National Security Strategy and National Defense Strategy largely eliminate uncertainty about China and Russia. With few exceptions, the documents brand the two countries as adversaries. Such branding directs the attention of individuals within the defense establishment. On any given day an intelligence officer faces an immeasurable number of complicated signals — military exercises, economic maneuvers, political statements. Certainty about the intentions of China and Russia makes it easier to decide which signals to pay attention to and how to interpret them. However, this focus conceals the objective amount of uncertainty on the strategic landscape.

Whether these documents will have a long-term effect is unclear. President Donald Trump’s apparent disagreement with them undermines their effectiveness. From the perspective of creating more uncertainty, that is a good thing.

Perceiving More Uncertainty

Perceiving more uncertainty is both personally and professionally difficult. It entails more work, the uncertainty itself is not pleasant, and the policy world does not reward it. Indeed, the task of policy school is to train students to write short memos. Professionals learn to cut through the din by simplifying, not complicating. Few defense experts would choose to remain ignorant of the boss’ views in order to increase their own integrative complexity. And politicians win by displaying the confidence that comes with certainty.

All of this is to say that simply trying harder will likely be insufficient. To better perceive uncertainty, individuals, groups, and organizations will have to pursue more substantive change. One easy step for leaders at any level is to structure workflows to ensure that individuals think separately before coming together as a group. This helps group members avoid anchoring on each other’s judgments of uncertainty. The practice of working independently before group meetings may seem counterintuitive, but it is increasingly becoming consensus advice for how groups should operate. Groups are best used as a tool for refining independently generated thought, not as a tool for thought generation.

Another way to sharpen perceptions of uncertainty is through what are known as “forecasting tournaments.” These contests ask individuals to make predictions about the occurrence of specific events over specific periods of time, giving them a tangible motivation to seek out complexity in a given policy arena. Any one forecast will not be especially meaningful, but over the course of years, forecasting tournaments provide tangible measures of who is best at perceiving uncertainty, which is valuable information for organizational leaders. They also replace the incentive to avoid uncertainty with the incentive to compete. The value of forecasting tournaments is starting to be recognized in the intelligence community, but strategists at all levels should make gradable forecasts in their area of expertise.
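One way such forecasts are commonly graded is the Brier score: the squared gap between a stated probability and what actually happened, averaged across questions, with lower scores indicating better calibration. The sketch below uses invented analysts and outcomes purely for illustration.

```python
def brier_score(forecasts):
    """Mean squared error between stated probabilities and outcomes.

    forecasts: list of (probability, outcome) pairs, where outcome is
    1 if the event occurred and 0 if it did not. Lower is better;
    always guessing 50 percent earns a score of 0.25.
    """
    return sum((p - o) ** 2 for p, o in forecasts) / len(forecasts)

# Hypothetical tournament: two analysts answer the same three questions.
# The confident analyst states near-certainties; the hedged analyst
# admits more uncertainty.
confident_analyst = [(0.9, 1), (0.8, 0), (0.95, 1)]
hedged_analyst = [(0.7, 1), (0.4, 0), (0.8, 1)]

print(brier_score(confident_analyst))  # punished hard for the one miss
print(brier_score(hedged_analyst))
```

In this toy example the hedged analyst scores better: one overconfident miss costs more than several honestly hedged answers, which is exactly the incentive a tournament creates for seeking out complexity.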

Strategy documents should frame the intentions of adversaries and allies as dynamic. Going a step further, the future state of great power relations should be treated as undetermined — for instance, policy guidance should make clear that future relations between China and the United States will be influenced by events U.S. policymakers have not yet imagined. Such indeterminacy gives the security establishment a good reason to keep looking for new information.

To be clear, a call to keep searching for new information is not a call to do nothing. Indeterminacy must eventually give way to the present and the need to act. The idea, however, is to act while continuing to search for new information.


Returning to our point of departure, perceiving uncertainty in the right way, and understanding its psychological foundations, is a way to get out ahead of surprise. In a rapidly changing world, perceiving uncertainty is a way to avoid being caught off guard.

Committing to see more uncertainty does not mean stumbling through war in ignorance. We should seek to replace the uncertainty we perceive with knowledge of the objective truth. Too often, though, we seek to replace uncertainty with the mere feeling of knowing. After all, as long as reality does not intrude, the illusion of understanding has the same psychological payoff as actual understanding.

Reality, of course, eventually intrudes. Ensuring that we find actual understanding rather than the illusion thereof requires a paradoxical approach: We should seek out uncertainty at the same time that we seek to reduce it. We should search for objective truth while remaining skeptical of our ability to find it.


Brad DeWees is a doctoral student in public policy at Harvard University and a captain in the Air Force. The views expressed here are his alone and do not necessarily reflect those of the U.S. government or any part thereof.

Image: Photo from Executive Office of the President of the United States, Public domain, via Wikimedia Commons