Carnage and Connectivity: How Our Pursuit of Fun Wars Brought the Wars Home


Editor’s Note: This essay is based upon David Betz’s recent book Carnage and Connectivity: Landmarks in the Decline of Conventional Military Power (Hurst & Co/Oxford University Press) as well as a lecture he delivered at the Engelsberg Seminar organized by the Ax:son Johnson Foundation (Sweden).


In 1984, at the mid-point of the Reagan era, Secretary of Defense Caspar Weinberger laid out the rudiments of the Weinberger doctrine on the use of force by the United States in a speech to the National Press Club in Washington, DC. The gist of it:

  • Wars should be fought only when there is a high degree of public support for them;
  • Wars should only be fought in the pursuit of interests that are vital to the nation; and
  • Wars should be fought to win — quickly, decisively, and in a spirit of massive commitment of effort to victory, howsoever defined.

It does not require supernatural abilities to see the grim specter of the Vietnam War animating this set of foreign policy principles. The American defense establishment’s memory of its hellish experience in southeast Asia still gnawed at its consciousness at that time — even while, by the mid-1980s, it was beginning to recover its sense of self-belief. A passage from the memoirs of Gen. Tommy Franks, American Soldier, that describes the army of the 1970s serves well to illustrate the nadir of the mood to which they wished never to return:

Like the nation it served, the army had begun to doubt itself [in Vietnam]. We had fought a long, costly war to a stalemate and then had withdrawn, claiming “peace with honour.” Pessimism and negativism extended from the Pentagon down to the infantry rifle squads and artillery gun crews.

It is hard to dismiss the pertinence of the doctrine, therefore, which indeed ultimately acquired a sort of orthodoxy in a later variant, the Weinberger/Powell doctrine championed by Colin Powell in his days as chairman of the Joint Chiefs of Staff in the 1990s. Basically, Weinberger had a good point: Armies are the nation’s ultimate insurance policy and their strength is not best frittered away in protracted, thankless, invertebrate conflicts. Nonetheless, there were critics both outside the establishment and within it. For the former, Edward Luttwak’s critique is exemplary:

It’s like a hospital that does not want to admit patients. Some hospital administrators want the perfect state of maximum readiness, and patients make a mess.

Reagan’s Secretary of State George Shultz, at war with Weinberger for the foreign policy soul of the administration, provided the acid phrase which subtitles this essay. Weinberger only wanted to fight the “fun wars,” ignoring the fact that America’s global interests were extensive and varied enough to necessitate the use of limited force in all sorts of contingencies that would never pass the test above. At the root was the question, in the words of William Safire writing in the New York Times: “Why are we wasting a hundred billion dollars a year on force we will never apply? Without the will, of what use is the muscle?”

Again, it is hard to dismiss the salience of the question. Much as soldiers may prefer (with good reason) to be treated like one of those alarm levers in public buildings encased under glass labelled “break only in case of emergency,” policymakers have multiple objectives and are obliged to ask such value-for-money questions about the instruments they have to hand. Shultz was by no means blind to the lessons of Vietnam. His point, rather, was that:

Power and diplomacy always go together. … Certainly power must always be guided by purpose, but the hard reality is that diplomacy not backed by strength is ineffectual. … Power and diplomacy are not distinctive alternatives. They must go together or we will accomplish very little in the world.

The tension between these two poles was not resolved in Reagan’s day, nor in the time of his successor George H.W. Bush. Somewhat famously, it was reprised in the first Clinton administration when Madeleine Albright, then U.S. ambassador to the United Nations, demanded in an altercation with Powell, just as the Bosnian War was heating up in 1993, “What’s the point of having this superb military you’re always talking about if we can’t use it?” In his memoirs, Powell noted that the remark nearly caused him an aneurysm.

In the run-up to his first election, George W. Bush made it clear that his administration would pursue a distinctively Weinbergerian approach. As his national security advisor-to-be Condoleezza Rice put it in a widely cited Foreign Affairs article, “Promoting the National Interest”:

The president must remember that the military is a special instrument. It is lethal, and it is meant to be. It is not a civilian police force. It is not a political referee. And it is most certainly not designed to build a civilian society.

Nearly a decade and a half into the War on Terror, howsoever named (the Obama administration has abjured the label, while carrying on with it in substance), the irony of Rice’s words will escape precisely zero readers. The gap between war as it generally is in reality (slow, crude, and politically imperfect) and war as we wish it to be (fast, precise, and politically decisive) has never been larger.

The West, more specifically the United States with its major allies alongside it, has been chasing the “fun wars” for 40 years and coming up empty. Recently at War on the Rocks, Sebastian Bae argued that America’s restless pursuit of the easy wars from the Gulf War to today has led it serially to quagmire and defeat. I would add a bit of depth to this excellent point — “fun wars” theory goes back to the dog days of the Cold War, if not further. Moreover, it is not just that its pursuit leads us to defeat abroad — protracted, thankless, invertebrate war is following us back home, trickling back onto our own streets.

The story of how this happened and why is not a happy one, but it is full of ingenious twists and Olympian efforts and, therefore, interesting and educational. It is, I think, a tragedy, though the conclusion has not yet been written. Perhaps, to mix the metaphor, the rabbit will be drawn from someone’s hat somehow, though it is doubtful. The hunt for wars that are fast, easy, and decisive has been run with great industry and technological inventiveness, to which the advent of the “information age” has contributed substantially. Over and over, we have substituted gadgets for strategy. All that the hunt has delivered so far is slow, bitter, and indecisive quagmires.

Under normal historical circumstances, a simple definition of “fun war” would have served to illustrate its quixotic incompatibility with reality. A fun war is a war against an opponent the contours of whose face precisely match the dimensions of the knuckles of one’s fist, and who obligingly presents this target fixedly at the distance of precisely one arm’s length. The existence of such an opponent defies common sense: By and large, people are not so stupid, particularly in war, which is a font of human ingenuity without parallel. It defies our most vital theory of war, as laid out by Clausewitz, that war is a duel of creative wills and not a contest with some mute and unreactive entity. And it defies the paradoxical logic of strategy, which says that the route to victory is doing exactly the opposite of what is expected.

But the gods like to play on mortal folly. Thus at the end of the Cold War, a giddily unexpected triumph of liberal democracy heralding, in Francis Fukuyama’s phrase, the “end of history,” they drew the first veil across the eyes of the defense establishment in the form of Saddam Hussein, who offered up the ultimate fun war. First, he invaded Kuwait in August 1990 and declared its outright annexation, thus violating the one true no-no of the post-Second World War international order. The noted international lawyer Sir Hersch Lauterpacht once wrote that “if international law is, in some ways, at the vanishing point of law, the law of war is, perhaps even more conspicuously, at the vanishing point of international law.” On the matter of the prohibition against the “use of force against the territorial integrity or political independence of any State,” however, Article 2(4) of the United Nations Charter is perfectly clear. The assembling of an international coalition was relatively easy, then, as was the establishment in the public consciousness of an unambiguously legal and just casus belli: Both rested firmly on an ironclad United Nations Security Council resolution authorizing the use of military force.

Second, Saddam parked two-thirds of his largely Soviet-equipped army out in the open, featureless desert to await the hammer blow of a vastly superior force that had just spent three generations training and equipping itself for exactly this sort of engagement — a recipe for a truly splendid little war.

Notwithstanding the pre-combat preparation of the American public by the Bush administration for potentially thousands of combat losses, victory for the coalition was never seriously in doubt, which is what led one French critic, the philosopher Jean Baudrillard, to declare that the war did not take place. But when the hammer did fall in the form of a 40-day campaign of aerial bombing, followed by a 100-hour ground campaign, the lopsidedness of the rout of the Iraqi military at the cost of a historically minuscule number of coalition casualties was surprising. The lesson taken by many was that the key to victory had been high technology, specifically information technology. In a nutshell: Knowledge is power; the side with the better microprocessors could best the side with the bigger battalions; friction, the “fog of war,” the fundamental quality of chance in war — all that could be erased. The popular press greeted this development with bounteous relish. As Marc Cerasini wrote in his book, The Future of War:

The world was awed and amazed when the first images shot by precision-guided munitions, or the aircraft that fired them, were broadcast during a post-strike military briefing the morning after the Gulf War began. No newspaper, no magazine article, no Discovery Channel special had given Americans a clue as to how effective surgical strikes using precision-guided missiles and bombs would be. Pictures of missiles striking parked aircraft, racing between buildings to obliterate a military target, or flying through the open window of an Iraqi government building alerted the population of the world to the revolution in warfare brought about by precision-guided munitions. The future had arrived.

The pride in the American military’s battlefield accomplishments (and relief that a bloody quagmire had been avoided) was palpable. “By God, we’ve kicked the Vietnam syndrome once and for all,” the president declared in an apparently euphoric statement at the war’s end. In its wake, the American military convinced itself that, even if the Gulf War had not itself been a revolution in warfare, it clearly pointed the way towards one. It became the key shaping event of American defense policy throughout the 1990s, changing the course of American military thought.

From the perspective of 2015, this seems ridiculous. Even in the immediate aftermath of the Gulf War, some found it ridiculous, or at any rate implausible. The putative “revolution in military affairs” was, self-evidently, wishful thinking. In the event, information technology proved massively empowering for traditionally weak non-state actors, at the expense of traditionally strong state actors, for whom the gains have been decidedly more equivocal. Notwithstanding this, it is impossible to deny the idea’s powerful grip on the imaginations of the world’s major armies. Adm. Bill Owens, in his book Lifting the Fog of War, put the case for the revolution most emphatically:

The computer revolution, if correctly applied, presents us with a unique opportunity to transform the US military into a lethal, effective and efficient armed force that will serve the United States in the 21st century. This is the American Revolution in Military Affairs. This new revolution challenges the hoary dictums about the fog and friction of war and all the tactics, operational concepts and doctrines pertaining to them.

In the meantime, through the period between the Gulf War of 1991 and the post-September 11 invasions of Afghanistan and Iraq in 2001 and 2003, respectively, “fun wars” theory developed another layer of seductive promise. Not only would high technology “lift the fog of war,” it would also, through stand-off precision weapons, dramatically reduce the need for “mass” in the form of ground forces. This line of reasoning overlapped with then-fashionable thinking about “de-massified,” post-industrial “knowledge economies” and their supposed military equivalents.

However, the thing being substituted for here was not so much “boots on the ground” as it was will. The problem for policymakers was that they perceived from their own electorates the need to be seen to “do something” about tragic events abroad, which were allegedly beginning to impinge more on public consciousness due to the increasing ubiquity of global media. But their actual ability to “do something” useful in such conflicts was really rather limited — in part because no one had much appetite for casualties or great material expense. The clamour nowadays to respond in some ill-defined way to the barbarous cruelty and aggressive chauvinism of the Islamic State in the Middle East reflects exactly the same pusillanimous combination of contradictory impulses.

The trick, then, was to do something sufficient to mollify public conscience, but at the least possible cost. The result — “post-heroic” warfare — may have been cynical, even low and dishonest, but it had a distinctively defensible strategic logic. The West, by and large, was content with achieving partial results, because it was not directly threatened by these conflicts in which it intervened. Hence the determination to obtain significantly different outcomes in them was proportionately mushy. It became accustomed to the idea that “waging modern war” was essentially strategically ambiguous in that it did not end in victory as the combatants would recognize. The Kosovo War of 1999, launched in order to forestall the ethnic cleansing of predominantly ethnically Albanian Kosovo by the Serbian forces of rump-Yugoslavia, was something of a high point of post-heroism as NATO fought it entirely from the air. In the words of Gen. Wesley Clark, who commanded NATO forces during the Kosovo War:

Though NATO had succeeded in its first armed conflict, it didn’t feel like a victory. There were no parades, except by the joyful Albanians returning to Kosovo. The military and the diplomats within NATO were simply relieved that the operation was concluded and they were absorbed in the next mission, working on the ground inside Kosovo.

Post-heroic warfare was ill suited, however, to the post-9/11 world, even though the initial results of “major combat operations” in the Afghan and Iraq campaigns were highly gratifying. Military analyst Max Boot in the pages of Foreign Affairs magazine hailed the advent of a “new American way of war” a little too soon.

The story of the myriad subsequent failings and ultimate demise of the wars in both places need not be covered in detail here. As Tim Bird and Alex Marshall concluded in their book Afghanistan: How the West Lost its Way, there have been two consistent themes in the Afghan campaign: First, the “mismatch between the dominant policy fashions pursued at particular points in time and the cycle of events in Afghanistan itself”; and second, the flawed execution of those policies “suggest[ed] that, even if there had been a closer alignment of approach with conditions on the ground, ‘success’ would have been elusive.” Meanwhile, the counterinsurgency debate will go on and on reproducing the same ideas and arguments, as it has in every generation for 100 years already.

To my mind, a more fundamental failing came in the form of a second veil: the conceit that what have come to be called “wars amongst the people” could be fought without engaging the passions of one’s own domestic population. Since 9/11, there can hardly be a single Western leader who has not uttered the phrase “war of ideas” to describe the context and stakes of the War on Terror. There has been a flood of new doctrines on “strategic communications,” “influence operations,” “information warfare,” “maneuver in the cognitive domain,” and so on. (No good propagandist would use the word “propaganda,” though that would be the more economical and correct term here.) But virtually none of this is focused upon the key population — the population whose beliefs and ideals most directly determine the sustainability of any campaign — the population at home.

We are enjoined to “counter the extremist narrative,” but what is the West’s narrative? What is the West’s conception of itself? The strategically debilitating thing is that it does not have one, or is incapable at the present time of articulating one. Traditionally, strategic narrative, or propaganda, rests on a combination of myth, rhetoric, and symbolism. Together, these are employed to achieve a “common will” as Walter Lippmann called it, an “oversoul … a national mind, a spirit of the age which imposes order upon random opinion.” The West has no such thing. What prevails instead is a multitude of narratives, variously contradictory, incomplete, or unconvincingly rendered.

Dozens of countries have contributed to the U.S.-led mission in Afghanistan and every single one of them has been dogged by the same question. Why? For all but one of them, probably the biggest reason they were there was because the core of their national security policy is to act as deputies to the United States when it is in global sheriff mode. And yet former Secretary of Defense Robert Gates recorded in his memoirs this startling realization during a March 2010 meeting on Afghan strategy:

As I sat there I thought: “The President doesn’t trust his commander, can’t stand Karzai, doesn’t believe in his own strategy, and doesn’t consider the war to be his.”

Voters may be stupid but they are not so stupid as to miss the gap between soaring rhetoric about necessary wars — wars that can and must be won — and the actual demands placed upon them as citizens in a country at war. Ask no sacrifice of the people and they will have no major stake in victory; on the upside, they will have low or no expectations of your leadership. Do not raise their taxes or conscript their sons, or even call for volunteers. On the contrary, enjoin them to be at ease, to continue in their daily habits and to shop as normal. In a less densely connected age than ours, this might have made sense. Now it does not. It is a recipe for losing.

To my mind, though, it is hard to find a more poignant critique of the period stretching from the Gulf War to today than this extract from the memoirs of Maj. Gen. John Cantwell, an Australian officer with 38 years of service encompassing three wars, from Operation Desert Storm in 1991 through Iraq in 2006 to Afghanistan in 2010, where he headed the Australian contingent. He recounted his struggle with post-traumatic stress disorder, troubled particularly by a gnawing doubt:

As I paid a final salute at the foot of yet another flag-draped coffin loaded into the belly of an aircraft bound for Australia, I found myself questioning if the pain and suffering of our soldiers and their families were worth it. I wondered if the deaths of any of those fallen soldiers made any difference. I recoiled from such thoughts, which seemed disrespectful, almost treasonous. I had to answer in the affirmative, or risk exposing all my endeavors as fraudulent. I had to believe it was worth it. But the question continues to prick at my mind. I don’t have an answer.

When your most senior commanders cannot construct an answer to the question “why?” that sufficiently assuages their own private consciences, there is something quite wrong with your way of war. Obviously, the fun war is a chimera. It would be facile to say who was more wrong — Weinberger or Shultz. The answer is neither and both. The more useful truth is that Clausewitz was right when he said war comprises a trinity of chance, passion, and reason. The West contrived, with technology, to eliminate chance and to substitute for passion; it failed on both counts, but it has succeeded in blasting reason to smithereens.

In London in late June 2015, in the midst of a Mediterranean-style heat wave, 1,000 armed officers of the Metropolitan Police, backed by special forces of the British Army, practiced their response to a multiple active shooter attack à la Mumbai, Nairobi, Paris, and so many other places — in a disused tube station on the Strand, beneath the university where I work. That day, as I edited the book upon which this essay is based, the sounds of automatic gunfire, the battle cries of the combatants, and the screams of the mock-wounded came through my open window. Helicopters clattered overhead and I watched police snipers bounding across the roof of Somerset House. It was not at all for fun. The eventuality for which these preparations are being made is painfully, self-evidently plausible, and everyone knows it — the more recent attacks in Paris showed this to be the case and they will not be the last.

The tragedy of the restless pursuit of fun wars is that it has inevitably taken us back home to our own streets, our own schools, hospitals, and public spaces.


David Betz is Reader in Warfare in the War Studies Department at King’s College London. His most recent book Carnage and Connectivity: Landmarks in the Decline of Conventional Military Power (2015) is published by Oxford University Press in the USA and Hurst in the UK.


Photo credit: Staff Sgt. Aaron Allmon, U.S. Air Force