Technological Prowess, Lethality, and the Civil-Military Divide
From Gen. George Washington’s 1783 address at Newburgh to Morris Janowitz’s The Professional Soldier and contemporary discussions about past and present administrations’ relationships with generals, there is an enduring debate about the nature and tone of the civil-military divide in the United States. Today, discussions about this divide tend to focus on — among other factors — changing demographics (including a geographical divide between the active duty military and society at large); the diminishing number of policymakers who have themselves served; and the growing “family tradition” in which today’s armed forces are composed of those whose family members have also served.
These are worthy and necessary considerations. However, a proper assessment of the civil-military divide requires an analysis not only of military personnel and the society they defend, but also of the changing character of war and the tools used to wage it. Advanced technology and innovation save lives in combat, but have unintended consequences for the military’s relationship with civilians at home. These technological shifts have lowered the human costs of warfare, shielded many military operations from public view, and contributed to the sense that the military is a “warrior caste” separate from the rest of society. Today, genuine civilian control of the military requires empowering civilians with the knowledge and tools to uphold that responsibility. Fundamentally, the solution to the civil-military divide will be found in the American education system and in heightened professional engagement between the military and society at large.
A Changed Approach to Warfighting
Since the end of World War II, the United States has relied on technological superiority — rooted in historical efforts to “offset” the Soviet Union — to close real and perceived capability gaps. Reacting to the Korean War and the fear that the Soviet Union would entangle the United States in small wars, President Dwight D. Eisenhower’s New Look strategy of the mid-1950s resulted in the first offset. This was an investment in strategic deterrence through massive retaliation and, later, flexible response. Beginning in the 1970s, the second offset capitalized on technological innovations and their application to military technology. The result was stealth technology, precision-guided munitions, microprocessor-based command and control systems, and digitally based intelligence, surveillance, and reconnaissance (ISR) capabilities. These technologies gave the United States clear, unchallenged military superiority in the late 20th and early 21st centuries.
The 9/11 attacks thrust the realities of aggressive military intervention back into the public consciousness, while, more generally, increasingly globalized threats forced dramatic shifts in America’s approach to warfare. Technological advances have allowed — and compensated for — a lighter troop “footprint,” while radically increasing operational effectiveness. But while many of the same technologies that contribute to American military dominance have muted risks to military personnel deployed overseas, they have also enabled American society to ignore or be ignorant of the constant low-intensity conflict in which the military is engaged.
The current approach to warfighting has removed most Americans and policymakers from the intimate realities of war in three ways. First, it has allowed for smaller military engagements with fewer losses of personnel, reducing the number of American citizens who are impacted by wartime casualties (as Sarah Kreps recently assessed, limited wars have also contributed to the declining relevance of wartime taxes). Second, technology has increased the military’s reliance on drones and special operations forces, masking many military operations from public view. Third, these factors have fostered a culture of a highly specialized “warrior caste,” which is unique in its specialized capabilities but distant from the society it defends.
Increasing Capability, Decreasing Casualties and Accountability
Today’s U.S. military is much smaller than the Cold War force. But it is also utilized more — a product both of shifting threats and of policymakers’ increasing preference for using the military as the first option rather than the last resort of diplomacy. The cumulative effect of higher utilization and a smaller force is a much higher dependency on complex technology to increase lethality and decrease combat losses.
The professionalized U.S. military is also far more refined than ever before. Consider, for example, what is needed to destroy a target in a 600 ft. by 1000 ft. area. According to a National Defense University report, during World War II, destruction of this target would have taken 9,000 2,000 lb. bombs and 1,500 B-17 missions. By the Vietnam War, destruction of the target would have required 176 2,000 lb. bombs and 88 F-4 missions. In the Gulf War, this required perhaps two precision-guided munitions from a single F-117. Today, this attack can be conducted with a single Joint Direct Attack Munition from an unmanned platform. The human and materiel costs of conducting the attack have drastically decreased, while the cost of the operator — in time and money — and the complexity of the technology have drastically increased.
One consequence of this shift is the expectation that the United States can perform feats of operational prowess with fewer personnel. For example, although the 1991 and the 2003 invasions and subsequent defeat of Iraqi forces were identical in duration (one month, one week, and four days, excluding subsequent peacekeeping efforts), the 1991 invasion involved upwards of 700,000 U.S. troops, while the 2003 invasion involved 125,000 U.S. troops.
Together, fewer battlefield personnel, higher lethality, and more effective battlefield medical care have resulted in significantly fewer combat casualties compared to previous conflicts. This creates a situation in which most of the general population watches wars on CNN without being impacted in terms of lost spouses, siblings, or children. The steady decline of anti-war protests suggests an erosion of both healthy skepticism about the use of American military force and government accountability for it. This means less pressure on Congress and other elected officials to end conflicts or to avoid entering them in the first place. Policymakers, as a result, can support (via appropriations) pervasive low-intensity conflict without significant public scrutiny.
Shifting Policy Costs
The military has become more averse to casualties today than in previous eras of conflict, in part a result of advanced capabilities that allow war to be waged while putting relatively few human lives at risk. Improved technological capabilities and the asymmetric nature of post-9/11 wars have placed an increased emphasis on over-the-horizon weapons and non-traditional forces, particularly special operators. Dispersed and asymmetric threats have required more agile responses and “real-time” intelligence, placing a premium on special operations and unmanned ISR. A 2016 CNAS report by Jacquelyn Schneider and Julia Macdonald, corroborated by multiple polls, found that Americans generally prefer unmanned platforms that lower risks to personnel. Today, thanks to the gradual acceptance of drone warfare over successive administrations, policymakers are presented with a perceived low- or no-cost decision to use force, and most Americans are broadly disengaged because there is no cost to them personally. The public’s support for a technological approach to warfare may be a driver of the division between civilians and the military, each of which sees that technology as a way to protect the other.
Not only does the technological evolution of the military mean that war personally affects fewer Americans, it also means that the changing character of war itself is often out of the public’s grasp. Capability specialization and warfighting innovation have created a force that is both complex and often incomprehensible to the average citizen. Today’s “high speed,” networked approach to counter-terrorism and counter-insurgency has seen notable success on the battlefield, while also leaving the full breadth and depth of technological capability and global military operations beyond the reach of most Americans. There is an inherent attitude of deference — the feeling that the military must be trusted to make the right decisions, because America’s engagements are too widespread and the battlefield too complex to be fully understood.
Necessarily, the post-1973 volunteer force is composed of Americans who want to serve. It has become logical, if also convenient, to view this volunteer force as exceedingly capable, as an elite and distinct “warrior caste.” The military itself has embraced this image — both the U.S. Army and Air Force creeds deem each soldier and airman a “warrior” by the second line. These creeds, both published in the post-9/11 era, contrast sharply with the more traditional Code of Conduct for Members of the Armed Forces, in which service members are instead identified as “an American, fighting in the forces which guard my country and our way of life.” In other words, military personnel traditionally held themselves to high standards not because they were of a certain “warrior caste,” but because they were Americans — products of a civilian society dependent on the willingness of some to sacrifice on behalf of all. Today, this norm seems to be eroding. Add to this the fact that a majority of military personnel are related to someone who has served, and it becomes clear why the armed forces are increasingly a class apart from the rest of America.
Repairing the Civil-Military Dialogue
Narrowing these divisions requires, to borrow military terminology, both strategic and tactical-level objectives. The American citizenry must recognize the collective responsibility of oversight as one of the most sacred acts of patriotism. This oversight starts with a solid education in civic responsibility and the events that impact U.S. national security. Our recommendations focus on increasing public exposure to national security events, policies, and practice.
There has been a catastrophic lack of attention paid to U.S. national security history in the primary education system. Since the 1980s, educators have struggled to incorporate America’s more recent wars into their curricula. Reasons have ranged from the controversial nature of Vietnam to cash-strapped budgets that have prevented educators from purchasing textbooks covering Iraq and Afghanistan. Increasingly, textbooks reference the fact but not the content of Vietnam War protests. And last year, researchers led by Amy Zegart at Stanford University found that only one Advanced Placement U.S. government exam in the past decade asked a question about foreign policy and national security.
Systemic obstacles to education cannot be ignored, but where feasible, high school and university educators should incorporate discussions of warfare, advocacy, and congressional oversight in the Vietnam and post-9/11 eras. Defense Secretary Jim Mattis has spoken extensively on the importance of history to military leadership. “Thanks to my reading,” he once wrote, “I have never been caught flat-footed by any situation, never at a loss for how any problem has been addressed (successfully or unsuccessfully) before.” This approach is no less critical for an accountable and educated citizenry. Citizens cannot be expected to hold institutions and leaders accountable for issues on which they have not been properly educated.
Even at America’s most elite institutions of learning, there is work to be done. In an international security course that one of us attended at Yale last spring, approximately 40 undergraduates were asked what event led to America’s military involvement in Afghanistan. The most common answers were: to unseat Saddam Hussein; to dismantle Afghanistan’s weapons of mass destruction; and “oil.” Eventually, the only student who correctly named the 9/11 attacks as the impetus was one born and raised in another country.
The military also shares in this obligation. University ROTC programs should encourage civilian enrollment in military history, national security, and ethics courses. As elite universities such as Yale, Harvard, and Stanford re-integrate ROTC into campus culture, campus leaders and military instructors ought to recognize civilian participation as an opportunity for cadets and midshipmen to better understand the public they serve, and for civilians to better know the principles and history guiding the military that represents them.
Similarly, service academies and war colleges have been responsible for creating the professionalized military force. Service academies, especially, are an intersection of civilian and military cultures. Last fall, Air Force Academy cadets in one of our classes were asked what made them different from their peers at civilian institutions. The cadets’ simple answer: “We are here.” The cadets viewed themselves as members of the warrior caste, but also as part of society at large. Their transition to being military officers would be incomplete had they not also received an extensive education on civilian oversight and control of the military. Such lessons are critical to developing an officer corps respectful of civilians and their oversight role. Service academies and war colleges should continue to engage civilian institutions and leaders on this front through conferences and fellowships. Institutions of higher learning, both military and civilian, are improved when resources are combined, and ideas and differences debated.
Beyond education and the suggestions traditionally advanced, there are other ways to create a society that is sufficiently exposed to the military. The reinstatement of a draft is unlikely for political and practical reasons, some of which Max Margulies recently outlined. But over the past several years, the Defense Department has considered so-called “lateral entry” recruitment as a way to fill critical billets within the services. The lateral entry concept — a product of Ash Carter’s tenure as secretary of defense — sought a pathway to open the ranks for highly qualified experts in the civilian space to directly commission into one of the services at a grade up to O-6. Lateral entry was proposed as a solution to critical shortages in cyber-focused career fields and had a long — and not inconsequential — list of potential downsides. However, a critically overlooked positive aspect of the concept is its potential to greatly increase civil-military dialogue and help a broader swath of civilian society engage with the complexity of the modern professional military. Lateral entry is not a panacea and there are serious barriers to its implementation (military culture is one of them), but it may provide another policy tool to re-engage civil society.
Technology and innovation are essential components of the American way of life, and important to achieving battlefield objectives, including limiting casualties. Increasingly, they are also the offset to a progressively smaller, leaner, exceedingly complex military rather than to a specific external threat. At the same time, technological advancements may be creating barriers to healthy civil-military relations. This is a paradox, and one not easily solved. But one place to start is by better educating the general public about military matters. This is, to be sure, a daunting task and may take generations to come to fruition. But, combined with specific and creative policy initiatives such as lateral entry, we believe the civil-military divide can be bridged, even in this era of rapid technological shifts, greater military lethality, and an increasingly disconnected public.
This article has been updated to incorporate a more accurate comparison between the number of troops who participated in the 1991 and 2003 Iraq invasions.
Torey McMurdo is a political science Ph.D. candidate at Yale University, a Navy officer, and an incoming fellow at the U.S. Naval War College. Previously, she worked with leading global technology companies as a strategy consultant in Silicon Valley. Christopher Hocking is an Air Force officer and faculty member at the U.S. Air Force Academy. He specializes in cyber and acquisitions strategy and holds a master’s degree in American Government and Political Thought from the University of Virginia. He is the Executive Director of the Cyber Conflict Studies Association and served in Afghanistan. These opinions are solely those of the authors and do not represent the Departments of the Navy, the Air Force, or Defense.