The Fog of Organizations
Graham Allison and Philip Zelikow, Essence of Decision: Explaining the Cuban Missile Crisis
There are very few professionals in any field who wouldn’t benefit from reading Graham Allison and Philip Zelikow’s classic, Essence of Decision: Explaining the Cuban Missile Crisis (be sure to get the updated second edition; all parenthetical citations below refer to it). While their titular case and main concern lie in the realm of national security decision-making, the book is really a synthesis of the literature on international relations, game and decision theory, economic theory on preferences and the firm, and organizational culture and behavior. As such, their work offers a rich array of lessons, drawn from a wide variety of sources, that apply to both the private and public sectors and range well beyond national security and international affairs. For me, the biggest takeaway is the book’s exposition of the fog of organizational behavior. Given humans’ cognitive bias toward assuming that decisions are made and actions pursued by unitary rational actors, we should weigh the fog of organizational behavior heavily in any analysis, planning, or decision-making we do.
Great leaps in technology, surveillance capabilities, and computing power have buoyed hopes that the fog of war will be lifted, that markets will approach the perfectly informed state of neoclassical theory, and that big data can suddenly make complex social phenomena understandable. As Allison and Zelikow so aptly demonstrate, however, the decisions and actions of large, complex organizations are a fusion of so many elements of culture, interest, rules, biases, path dependencies, and interactions that aggregated human behavior will never approximate that of a unitary rational actor. No matter how good the information and communication technology, no matter how good the underlying data, theories, and premises, the fog of organizational behavior will never be lifted.
We must not overstate the case, lest we be accused of attacking a straw man. Few serious professionals would go to the extreme of basing analysis on true unitary rational actor assumptions—or the proverbial billiard-ball or black-box realist approach to international relations. No, we are all cleverer than that. Yet there is an overwhelming and unavoidable human bias toward the unitary rational actor. As individuals, this is how we reason. Whether commanders or executives, we learn from our earliest days to put ourselves—that is, myself—into our competitor’s shoes when trying to anticipate their actions. When facing an organizational problem, our first instinct is to ask, “What would I do to resolve this issue?”
Critically, we don’t ask something like, “What set of partially tailored actions could the subgroups of my organization undertake based on their own programs, standard procedures, and biases, under the imperfect control and coordination of a set of leaders with varying agendas and powers, to achieve the best of a range of suboptimal outcomes?” Yet this perhaps approximates what we should be asking when trying to understand others’ actions and when trying to determine what options our organization really has at its disposal. We are programmed, however, to err more toward the unitary rational actor model, meaning we often misunderstand the intent behind others’ actions, improperly forecast their future moves, and tend to grossly overestimate our own organization’s ability to deal rationally and efficiently with complex problems.
This is, of course, the subject of many books, and I can only touch the high points of what may be the most important of them—at least from a national security perspective—in the shallowest way here. The exercise still has value, though. To many, these lessons will be nothing new. In a way, they are a repetition of the exasperated complaints familiar to anyone who has dealt with a bureaucracy. Yet that is the very reason they are so important. Though we are familiar with the messy outcomes of complex organizations undertaking complex tasks, we often analyze and plan as if this were the exception, not the norm. If we want better results, we need to be more realistic about what organizations can and cannot do, especially in a murky environment in which little can be trusted as fact.
Organizations seek the mean
Larger organizations are built to create predictable results across a relatively narrow band of operating environments. This tendency can be illustrated by the much-discussed issue of talent management in large organizations. While many commentators bemoan the fact that large organizations are often unable to properly utilize highly talented individuals, the reality is that most organizations are not built to maximize talent, but to minimize deviation. As Allison and Zelikow write, “A major purpose of organizing is to ensure that any of the operators, whatever their unique preferences and gifts, can interchangeably and successfully perform normal tasks on any given day. If you need to know the name of the pilot to determine whether the flight will be safe, or even arrive at all, then the airline has failed” (147, emphasis mine). This is an excellent example and a reminder of what larger organizations are built to do. They normalize actions and minimize deviations.
To do this, organizations simplify, smooth, and standardize. They use a variety of tools, such as human resources processes, standards, standing operating procedures, rules, and norms, to create a uniform organizational culture and set of operating paradigms. The paradigms and culture situate the organization, its operational objectives, and its capabilities in a certain “market” where it is most comfortable and most dominant. Within a certain range of inputs (talent, resources, issues to be addressed, etc.), successful organizations produce an output of highly predictable quality.
Within an organization, then, actors do not necessarily follow a logic of efficiency, rationally and uniquely crafting their actions to maximize results in each separate scenario. Instead, they follow what scholars James March and Johan Olsen have labeled the “logic of appropriateness.”
The logic of appropriateness is a perspective that sees human action as driven by rules of appropriate or exemplary behavior, organized into institutions. Rules are followed because they are seen as natural, rightful, expected, and legitimate. Actors seek to fulfill the obligations encapsulated in a role, an identity, a membership in a political community or group, and the ethos, practices and expectations of its institutions. Embedded in a social collectivity, they do what they see as appropriate for themselves in a specific type of situation.
In this organizational milieu, “successful compliance is successful performance” (168). This is actually quite rational for issues in the heart of an organization’s “normal” range. It is far easier to produce highly standardized results when actors carefully follow a well-defined set of rules and procedures. If you let every member of the organization wing it based on their own view of the world, you will get some curious results, to say the least. This logic falls apart, however, when the issue trends toward the “abnormal” range. We’ve all experienced the maddening frustration of a bureaucrat unable to do the “logical” thing when faced with a slight curveball. Thus, we shouldn’t be surprised when organizations struggle in the face of supremely complex and abnormal situations.
Think of the U.S. military’s struggle at the institutional level to adapt to the insurgency in Iraq. At first, senior officials fought even the suggestion that they were facing an insurgency. They wouldn’t use the word. Once events overtook them, they begrudgingly adapted—far slower than individuals and smaller units did—but it was a fight all the way. Even today, there is a fight over where the new “normal” range should be, highlighting the point that organizations seek one comfort zone and want to stay there.
For a private sector example, read Michael Lewis’ The Big Short to see how blind organizations can be to catastrophe when it doesn’t fit their normal model.
Organizational responses are likely to follow a standard logic of appropriateness, not a specific logic of efficiency
This is where the rubber meets the road. Whether considering one’s own organizational reaction or trying to divine logic from the reactions of other organizations, we must keep well in mind that the logic is not one of isolated response, but of overall appropriateness. Allison and Zelikow provide some great examples of this from their Cuban missile crisis case. To an outside analyst, the Soviets were so unconcerned with concealing and hardening their missile positions that it would seem the deployment was meant to be seen, and thus was an exercise in strategic signaling. In reality, Soviet strategic leadership believed that the missiles would be better concealed, but the operational units faced with the task of emplacing their weapons did so quite literally by the book. And the book had no instructions for camouflaging missile emplacements, because it simply wasn’t done within the Soviet Union. Furthermore, the layout of the sites and the construction of the nuclear warhead storage bunkers were in strict accordance with Soviet publications, leaving imagery analysts with zero doubt as to what was there. As is noted in the account, the nuclear warhead bunker wouldn’t have passed inspection had it been constructed differently to conceal its purpose: compliance over performance.

When discussing another tactical decision that seemed at odds with Soviet operational and strategic goals (the choice not to announce the presence of tactical nuclear weapons on the island once the crisis began), the commander on the scene later grumbled, “Arcane theories of deterrence mattered less to us than practical questions of assuring our exposed troops the strongest possible armor against attack.”
The United States was far from immune to such disconnects between strategic leaders and tactical actors, as its organizations, too, acted according to standing operating procedures and long-standing plans and preferences. Military leaders not only acted with autonomy, but specifically resisted political leaders’ attempts to adjust their responses based on grand strategic concerns. Administration-level decisions, such as setting Defense Condition 3 and choosing a quarantine/blockade course of action, produced a cascade of effects whose details the administration could never fully know, as long-standing military plans and policies in the objective area and around the world were triggered by these seemingly simple top-line decisions. For instance, when Secretary of Defense Robert McNamara questioned Chief of Naval Operations Admiral George Anderson on the details of a naval blockade that could precipitate global nuclear war, Anderson told McNamara, “Trust me.” It was all going according to plan, Anderson said, and pointed to a large binder containing the pre-existing operational plan.
To bring in a more recent example, this dynamic is present even within a single organization like the military. Once the institution acknowledged that it was fighting a different kind of war in the insurgencies in Afghanistan and Iraq, senior commanders struggled to change their troops’ operating paradigms. They published directive after directive and mandated courses, lectures, and the like. While this eventually produced change in high-intensity tactics like close air support and fire support, it took a massive and continuing effort to steer the organization away from business as usual. It took months and even years to put a dent in many commanders’ intentions to fight according to their standing tactics, techniques, and procedures. (On the flip side, the “new” paradigms were embraced and even proven on the battlefield by other commanders who were eager and able to adapt quickly in smaller organizations.)
Likewise, when Kennedy told the Department of State to begin discussions with Turkey about the possible withdrawal of Jupiter missiles as a quid pro quo for Soviet missile withdrawal from Cuba, State instead acted according to its own long-standing goal of moving from a country-specific nuclear deterrent posture to an overall NATO deterrent posture. In that framework, withdrawal would take years, not days, and thus the specific issue was never broached with the Turks. Kennedy returned to the issue after Khrushchev publicly offered such a quid pro quo over the radio, and he was extremely displeased to find himself maneuvered into a corner because his direct wishes had not been faithfully carried out by his own diplomatic organization.
Examples abound in Essence of Decision of how attempts to deduce rational motives behind another organization’s actions, or to predict the results of seemingly simple orders within one’s own set of organizations, are constantly trumped by complexity, organizational logic, pre-existing planning, and the path dependency of decisions and actions. This is a vital lesson for leaders in any context. We live in a world of baffling complexity, and while we are obliged to try to understand our environment and act intelligently within it, we have to start with some humble baseline assumptions about just what we can expect to grasp and to do in these conditions. Yet even if a leader is enlightened about the organizational dynamics laid out above, there are more challenges.
Organizations constrain the scope of investigation and decision
We discussed organizational execution first because it best illustrates that organizations are built to normalize operations. A close corollary is that in the monitoring, deciding, and planning phases, organizations will try to fit situations to their worldviews and capabilities. Organizations see what they are built to look for, often creating at least a partial blindness to outlier events. Furthermore, when large organizations confront complex problems, they tend to divide the problem into smaller parcels that are addressed by specialized subgroups. In this process, organizations tend toward reductionism in framing the problem and toward providing a range of responses that constrains leaders’ choices, often quite narrowly.
The often-criticized 2003 Iraq war plan was constrained significantly by path dependency and reductionism, as planners worked from a partially formed plan that was already on the shelf. Additionally, while the military had standing organizations that excelled at parceling out the kinetic phases of the operation, there was no standing organization or paradigm to flesh out what to do once the conventional battle was won.
Likewise, while the history of decision-making about intervention in Syria is far from written, it seems that when the President asked what could be done about Syria, the cleanest options were those a single organization, the Department of Defense, could put together. Presented with clean briefings on “what we can do tonight” versus the messy process of diplomacy, the President found his choices quite constrained.
The private sector is not immune to these dynamics by any stretch of the imagination. In a study cited in Chip and Dan Heath’s book, Decisive, scholar Paul Nutt found that of 168 business decisions studied, only 29 percent of the organizations considered more than one alternative. Over 70 percent were thus making yes-or-no decisions on a single option, with millions, if not billions, of dollars at stake.
And when making decisions based on these options, leaders quite often operate with far less information than would be ideal. In another gem of an observation, Allison and Zelikow write, “Those who decide which information the boss shall see rarely see their bosses’ problems.” The stove-piped funnels of information, the narrow focus of specialists and specialist organizations, the need for secrecy, indeed the whole nature and capacity of our hierarchical organizations mean that those supporting the decision-maker rarely understand the totality of the decision or decisions at hand. As a general officer boss of mine once said, “People make bad decisions because they are given bad information.” While sometimes people make bad decisions even with good information, anyone who has observed the staff process knows that truth and complexity rarely float to the top.
The rollout of the Affordable Care Act is a political hot potato, but no matter how jaundiced one’s view of the President and his candor, an astute observer of organizational behavior will recognize that no one in the bowels of the Administration was willing to truly sound the alarm at the White House that an iceberg loomed in the form of an inadequate website.
In a private sector example, the continued foibles of the financial sector, from the mortgage-backed securities debacle to recurring trading scandals, have been enabled in no small part by the complexity of the banks and of the trades and products they deal in. Unitary rational actors do not build houses of cards that will wipe their entire balance sheets away, but complex organizations that lack perfect information flows to leaders and perfect incentive structures for agents do.
Once decisions are made, the results sometimes bring the crashing realization that the situation is abnormal and the organizational response is not wholly appropriate. Yet even then, organizational inertia makes adjustment harder than we would imagine. Again, I quote Allison and Zelikow because they state a common problem so well: “Organizations exhibit great reluctance to base actions on estimates of an uncertain future. Thus choice procedures that emphasize short-run feedback are developed. Like house thermostats, organizations rely on relatively prompt corrective action to eliminate deviations between actual and a preset, desired temperature rather than on accurate prediction of next month’s temperature.” Here, I am reminded of the piecemeal, incremental, and, in the strategic view, truly inadequate adjustments that characterized America’s escalations in Vietnam, Iraq, and Afghanistan.
These are, perhaps, less than a revelation for most. Yet, we too often forget these limitations when estimating what organizations are capable of doing and understanding. Clearly, we cannot stop using standing organizations to deal with crises. At the same time, we must not think that technology, new management concepts, or radical changes to organizational structure can dramatically remake the nature of human interactions. What we can do is to be more vigilant about the limitations of organizations, attempt to implement controls where possible, and most importantly be far more humble and realistic about our limitations when making policy choices.
In closing, I offer Allison and Zelikow’s list of cautions about organizations. They should be on the wall of every decision-maker. There are myriad controls that could be suggested to at least partially mitigate these concerns and those laid out above. Instead of offering generic prescriptions, I’d ask you to consider how these might affect your organization and how you would account for or mitigate their effects on your operations.
Cautions on organizational limitations
- Organizations are blunt instruments
- Projects that demand that existing organizational units depart from their established programs to perform un-programmed tasks are rarely accomplished in their designed form
- Projects that require coordination of the programs of several organizations are rarely accomplished as designed
- Projects that bring together programs of several organizations will feature an interaction of routines, producing unforeseen and possibly dangerous consequences
- Where an assigned piece of a problem is contrary to existing organizational goals, resistance will be encountered
- Government leaders can expect that each organization will ‘do its part’ in terms of what the organization knows how to do
- Government leaders can expect incomplete, or even distorted, information (from the leaders’ perspective) from each organization about its part of the problem
Peter J. Munson is responsible for preventive services and global crisis management for a private sector corporation, coming to this position after his retirement from the US Marine Corps in 2013. He is a Middle East specialist with professional proficiency in Arabic. Munson is the author of two books: War, Welfare & Democracy: Rethinking America’s Quest for the End of History and Iraq in Transition: The Legacy of Dictatorship and the Prospects for Democracy.
Image: USMC, John Sullivan