When Superiority Goes Wrong: Science Fiction and Offset Strategies


For all our talk about the need for military technical superiority, what if pursuit of that goal becomes our downfall? A couple of weeks ago, Bill Sweetman of Aviation Week and I were talking about technology development issues and the Pentagon’s new offset strategy, the Defense Innovation Initiative. During the conversation he mentioned a 1951 Arthur C. Clarke story, Superiority, that reminded him of some of our current challenges. Being the nerd that I am, I read it that evening with high hopes.

The story is great: It’s short and you should all read it. I’ll unpack it in a moment, but I’d like to pause for a second and consider the role of science fiction in military technology thinking. Why would I automatically assume that a science fiction short story from 63 years ago would be useful today?

Most defense nerds love science fiction. Peter Singer famously explored this relationship in Wired for War, drawing out the connection between the technologists behind unmanned weapons system development and the science fiction that inspired them. The New America Foundation recently hosted a daylong conference, headlined by Neal Stephenson, that sought to refocus science fiction on providing inspiration to today’s scientists and engineers. August Cole, of the Atlantic Council’s Art of Future Warfare project, and Peter Singer will publish Ghost Fleet: A Novel of the Next World War next year. And, well, this is how Lt Col Dan Ward writes his books.

We enjoy being amazed by the prescience of authors, from Robert Heinlein’s networked, highly mobile and lethal mechanized infantry in Starship Troopers, to William Gibson’s conception of ‘cyberspace’ in Neuromancer. However, for this defense nerd, technology prediction is merely a by-product of the true genius and value of good science fiction: human insights that teach us about our actions and ourselves.

This perspective on science fiction is best articulated by Ursula Le Guin in the foreword to The Left Hand of Darkness, and by Neil Gaiman in the introduction to the 60th Anniversary Edition of Ray Bradbury’s Fahrenheit 451.

Clarke’s short story, Superiority, does not predict technologies that we recognize today, but elegantly describes a number of disturbingly familiar military technical failure modes. Such insights are especially helpful when thinking about new endeavors like the Pentagon’s Defense Innovation Initiative, which will include both a new long-range research and development planning program and an offset strategy.

In a distant future, an unnamed dominant military power has been engaged in a lengthy space war with a technically inferior adversary. The dominant force appoints a new “Professor General,” who changes its technology strategy from incrementally upgrading existing systems to developing and deploying new weapons, believing that “a revolution in warfare may soon be upon us.” This change in strategy sets off a series of disastrous events that ultimately lead to the dominant power’s defeat.

Here’s how the decline unfolds. The superior force abandons production of its old weapons platforms to focus on developing a new “irresistible weapon.” The weapon takes longer to develop than planned and can be launched only in limited quantities. During the development period, the adversary builds larger numbers of its inferior weapons, so that even when the new weapon works as planned, it does not provide the anticipated advantage. The superior force then attempts a large-scale effort at battle management automation, only to have the enemy rapidly adapt to its new concept of operations, targeting central nodes in its new order of battle to devastating effect. In response, the previously superior force develops one final new weapon, only to suffer significant integration problems that throw its forces into disarray and precipitate its defeat within a month.

The story hits home for anyone familiar with the challenges of developing new military technologies and capabilities. Many of these issues can be seen, to lesser degrees, in recent air, sea, and land weapons system developments. Clarke’s story is a powerful reminder that these issues should not be attributed to technology, the cunning of our adversaries, or macro trends in technology diffusion. They are ultimately human failings.

The United States relies on technical superiority to maintain its military advantage. But that technical superiority requires humans to generate the right strategies, design and build the right technologies, devise concepts of operations, and train forces to operate the technology to achieve strategic and tactical objectives. Sometimes this requires new “leap ahead” technology and sometimes it does not. Given that human judgment is required, and that all humans are fallible, we cannot hope to be right 100 percent of the time. However, we can never let the hubris evident in Superiority lead us to defeat due to, as the narrator assesses, “…the inferior science of our enemies.”


Ben FitzGerald is the director of the Technology and National Security Program at the Center for a New American Security. He co-directs the “Beyond Offset” initiative at CNAS.


Photo credit: Jorel Pi