The Obstacles on the Road to Better Analytical Wargaming

Jon Compton

Now that Bob Work has left the building — well, two years ago — what is the future of wargaming as an analytical method in the Department of Defense? That’s not an easy question to answer. The last few years have seen both steps forward and backward as supply sought to meet demand. But now demand may be tapering off, as traditional biases for more mathematically based methods begin to reemerge and attempt to reclaim the mantle of analytical validity, further complicated by the uninformed fascination with machine learning and “big data” as an alternative to gaming.

As an advocate for, and practitioner of, analytical wargaming within the Department of Defense, I’ve witnessed some good things emerge over the past few years, but also enough poor practices to reinforce, to no small extent, the criticisms of gaming made within the operations research community. Peter Perla’s 2016 call to improve wargaming, while widely read and commented on at the time, was mostly ignored by wargame practitioners at federally funded research and development centers, universities, and defense contractors, who, frankly, seem largely content to continue with business as usual.

It would be easy to dole out a list of examples here, but some time spent perusing the Defense Department’s Wargame Repository (if you have access) will tell the story. With million-dollar wargames putting out such profound insights as “Cyber will be very important in future warfights,” it becomes hard to justify a method that is often costly, yet generates such paltry and obvious results.

Perla’s description of the BOGSAT (“bunch of guys sitting around a table”) masquerading as analytical exercise hits the nail on the head. Attendees of such games often come away wildly enthusiastic about the success of the wargame, reporting that they “learned a great deal.” Yet when pressed, they are hard put to provide a single insight that most of the participants didn’t already know when they started. This is made worse by wargame results that are all too easily falsified when underlying assumptions are subjected to more rigorous analytical scrutiny. The pathologies Chris Weuve catalogued in his monograph Wargame Pathologies are demonstrated often enough in practice, but if we’re honest with ourselves it’s not hard to understand why: spending an extended period in an intellectually stimulating environment with people of similar interests is personally very gratifying. But that’s not the same thing as a successful wargame.

In my experience, frustrations with this trend came to a head when Work, then deputy secretary of defense, tried to realize a more robust wargaming enterprise. The real challenge was to create a wargaming paradigm that would be accepted by various analytical organizations. Wargame providers, however, were content to dismiss analytical concerns and requirements, in one case going so far as to bluntly state that despite client concerns, wargames were theirs to execute as they saw fit. They were, after all, the professionals, and they would not be told how to do their work. Unfortunately, the work of these professionals did not meet informed client needs.

As a result of the direct failure of the wargame community to adequately address analytical concerns, the department created wargame teams capable of designing and executing wargames that would be fully integrated with more traditional analytical methods. The work had to be done on tight timelines, and it required that every game be custom-designed to address level-of-analysis concerns. The games had to be rigidly adjudicated and conducted with as few external participants as possible, bringing in subject-matter experts only where needed. Designers were brought in from the commercial board-game world, along with subject-matter experts, research methodologists, and analysts. Over the course of a six-month study there could be as many as five different rigidly adjudicated game systems created, with execution consisting of as many as 20 to 30 games, all while integrating with traditional analysts who would run or create models and simulations to validate or falsify game assumptions. The efforts often spanned levels of analysis from the tactical all the way to the geostrategic within one study. The results of these studies had direct and significant impact on several high-profile defense efforts, including net assessments, operational plans, and major acquisition programs. The games were analytically credible by virtue of the fact that multistage efforts created end-to-end logical narratives for why the results were what they were, and why they mattered. And, in some instances, genuine innovation emerged.

It is significant that much of this effort had to do with the concept of analytical ownership. Traditional wargame practitioners provided a service, not an analytical product. That is to say, the traditional approach of getting a bunch of subject-matter experts into a room, dividing them into teams, and facilitating a red-blue interaction, with the end result being a report back on what happened, was not acceptable. What’s more, it was not a valid method with which to get at any complex issue in any analytical detail. Instead, the department sought a process of analytical ownership in which there was a research design that incorporated wargames along with other methods, and a final product that described in narrative detail why the effort was calibrated around certain theories of success.

Several agencies within the Defense Department, particularly within the Office of the Secretary of Defense and the combatant commands, have now seen the effectiveness and impact of a complete analytical process that incorporates wargames and are beginning to consider how they might do the same. The notable exception to this interest has been among more traditional practitioners in the wargame community. To date, not a single federally funded research and development center, contractor, or educational institution that purports to provide a wargame service has shown the slightest interest in providing a complete analytical solution that incorporates wargames, nor have they shown interest in analytical ownership of the outcome. To my knowledge, none has even been interested enough to ask what the requirements are. Apparently there is enough demand elsewhere to keep the wargame community busy, and if the reports I’ve read from recent wargames are any indication, the BOGSAT is alive and well.

And with new leadership come new priorities. I can’t predict what any of that will mean for the wargame community at large, but the desire for more analytically robust wargames is certainly present, and consumers at the Defense Department are now aware that better is possible. The question remains whether the community will step up to address that desire, or continue along the current path of least resistance, providing visceral experiences devoid of rigor or analytical depth.

It is not my intent to paint the entire community with one brush. But frustration with the professional wargame community of practice is real and growing among many of us in the department. Substantial investments have been made in facilities and institutions to improve wargaming as an analytical tool, but many feel this investment has been wasted on practitioners too wedded to what they’ve always done to make any real improvements to the process. Witness the great concern in the wargame community over how the next generation of wargamers will be trained, while avoiding any substantive discussion or investigation of what analytical requirements wargame consumers actually need met. Concerns over the community of practice’s future might be best addressed by finding out what consumers actually want out of wargames and what has already been done along those lines within the analytical organizations of the Department of Defense.

Dr. Jon Compton is a senior analyst and wargame subject-matter expert in the Office of the Secretary of Defense. He holds a doctorate in formal research methods and world politics. The views expressed in this article are those of the author and do not reflect the official policy or position of the Department of Defense or the U.S. government. The appearance of external hyperlinks does not constitute endorsement by the U.S. Department of Defense of the linked websites, or the information, products, or services contained therein. The Defense Department does not exercise any editorial, security, or other control over the information you may access at these locations.

Image: U.S. Navy (Photo by Cmdr. Gary Ross)