Technology and Future War Will Test U.S. Civil-Military Relations


Relations between America’s civilian and military leadership are going to face serious new challenges as the U.S. military prepares for future war. In coming years, the Pentagon will make portentous decisions about how to adapt transformative technologies like artificial intelligence (AI), robots, and human augmentation. The potential for change is enormous.

AI is an enabling technology that can be used in diverse domains of military activity — everything from weapons systems, intelligence, logistics, and training to the learning tools employed in professional military education. When combined with robots, AI will increase the ability of machines to operate autonomously. With advances in robotics, computing, and neuroscience, military personnel will increasingly be able to compensate for their cognitive and physical limitations with biotechnology and implantable devices. These technologies will fundamentally reshape the character of war, if not — as some have speculated — its very nature.

It’s no surprise, then, that so many people are talking about technology and the future of warfare. So far, much of the conversation has focused on weapons systems and, increasingly, the ethical issues they pose in future war. But adapting these technologies has larger implications for relations between military and civilian leadership on several fronts — for how they weather pressures for organizational change and jurisdictional threats to the profession of arms, and for how they address new challenges to civilian oversight and military advice. Alongside the transformation in the character of warfare comes a major new era for the U.S. military and the country’s civil-military relations.

Organizational Change Tests Civil-Military Relations

One big challenge relates to the growing pressures for organizational change that emerging technologies will inevitably produce. The military is facing some hard questions about how it will adapt its culture and institutions to exploit new technologies — and civilians face a tough job ensuring they answer them effectively.

To see this, consider how the role of humans in tactical-level warfare could change. As technology advances, many of the tasks that humans now control in the battle space may increasingly devolve to machines. Today, a field commander might direct a system to attack a target or help coordinate logistics. In the future, as autonomous systems become more sophisticated, those inputs might become more expansive and be issued by leaders at higher levels in the chain of command. A commander of a carrier strike group might, for example, tell an AI-enabled system simply to “protect the carrier,” or a brigade commander might tell a machine to “protect the force.” These orders would bypass the subordinate commanders who today coordinate the tactical defense of the carrier or unit.

The integration of unmanned systems using AI in “human-machine teams” will force other changes. With the growing use of AI, robots, and the cognitive and physical augmentation of military personnel, fewer humans will be needed to build mass. Some argue that what now requires major deployments of military personnel could be accomplished in the future with a much smaller force as operational concepts such as the robotic “wingman” are adopted.

It’s not much of a stretch to imagine that changes like these are going to be hard for parts of the military to accept — especially some in the combat arms. The integration of new technology will test the identity and culture of military organizations in which humans are central to tactical warfare. Increasing reliance on machines might also require new thinking about command authority and organizational structure. Unknown is whether civilian innovators or the military mavericks they support will be able to deal with the pushback from those resistant to organizational change. More certain is that confronting these challenges will put new stresses on civil-military relations.

Civilian Oversight of Military Activity Gets Harder

Civilian oversight could also get a lot more complicated. It may become harder for Congress and civilians in the Pentagon to oversee military activity. Barriers to entry in the defense establishment will rise, as civilian political appointees will need technical training to carry out oversight effectively. They may also struggle to challenge and evaluate military activity, or to test arguments that military leaders base on AI.

Reliance on AI and complex technology also exacerbates the information asymmetry about military affairs that already exists between military and civilian leaders. The mystification of military activity will accelerate as it becomes increasingly difficult to explain the whys and wherefores of machine-assisted outputs. Technology may also magnify existing tensions in civil-military relations. The civilian leadership may have even more reason to fear that the military is obfuscating or hiding behind the technology, or otherwise trying to take advantage of it to privilege a preferred strategy or policy outcome. The information asymmetries created by technology open new opportunities for shirking and, with them, new tensions between military and civilian leaders. This means that the underlying trust and rapport developed between civilians and their military counterparts will become even more important than they are today.

Weighty issues related to the future of warfare could also complicate civilian oversight. Today the Pentagon maintains that “the human is always in front in terms of DoD thinking.” But pressures are going to mount to allow machines to make more and more decisions autonomously in armed conflict — especially as the country’s adversaries increasingly allow machines to fight their wars with little human supervision. One day, this might even include allowing those systems to decide who to kill in the course of a military operation. Civilian and military leaders might not always agree on whether or when delegating such momentous decisions to robots is a good idea. Military commanders may be willing to take risks so they can capitalize on the many benefits that relying on machines provides. Civilians might worry about the strategic and political blowback of machines playing a larger role on the battlefield — let alone killing people. The inevitable tensions associated with civilian oversight of military activity could get much worse.

Civil-Military Advisory Processes Get Complicated

Civil-military advisory processes are also going to get more complicated. AI-assisted decision-making promises to enhance the human grasp of complex environments while mitigating the distorting effects of individuals’ emotional and cognitive biases. Senior military leaders may in the future increasingly rely on AI-assisted systems as “strategic counselors” as they assess military options.

Machines as adjuncts in strategic assessment could be hugely beneficial. But military leaders are going to have to use these systems wisely if they are to rely on them when advising civilians. Decisions generated through AI and machine learning are not bias-free. They reflect the character of the human-generated data upon which they rely. Military leaders’ training and education in professional ethics, and ongoing efforts to cultivate character and judgment among officers, are going to be paramount.

Even with that preparation, military leaders’ reliance on AI could create problems in relations with civilians in strategic assessment. What happens when a military leader’s advice seems far-fetched to civilian ears? The historic Go match between Google’s AlphaGo and Go expert Lee Sedol played in South Korea in 2016 is instructive. At one point in the match’s second game, which AlphaGo ultimately won, the computing system made a move so beyond explanation that Sedol left the room to contemplate its meaning. Faced with a similarly inscrutable machine-generated recommendation in the military domain, civilian leaders may be skeptical. They may discount military advice or seek out their own inanimate strategic counselors as alternative sources of guidance. Either way, the challenges of providing military advice are likely to grow in the future.

The Military Mobilizes Against the Threat to the ‘Profession of Arms’

One final set of challenges comes from the existential threat new technologies pose to the profession of arms. Civil-military relations in the United States are based on a division of labor in which the military retains a clear domain of responsibility and authority apart from the civilian sphere as experts in “the management of violence.” This exclusive sphere of expertise is a defining feature of the military profession.

But the lines are increasingly blurring in future warfare. What counts as distinctively military expertise may shrink relative to civilian sectors. One reason is the military’s need to cultivate close partnerships and work seamlessly with commercial technology firms and private-sector experts as it adapts to future war. The striking differences between military and tech culture make this harder. As Stephen J. Gerras and Leonard Wong observe, military culture exhibits a high degree of power distance and limited assertiveness among subordinates. Tech culture is just the opposite: it is relatively intolerant of disparities of power within the organization and more accepting of subordinates challenging superiors’ arguments. Given this culture clash, interprofessional competition between the military and the civilian sector over jurisdiction may be inevitable.

Resistance to the shrinking of the profession of arms’ jurisdictional boundaries and exclusive domain of expertise could once again result in bureaucratic pushback and shirking. Even worse, it could encourage a full-scale political mobilization by some parts of the military in defense of “the profession of arms.” Perhaps it is time to re-read Andrew Bacevich’s compelling account of Gen. Matthew Ridgway’s extensive political campaign against Eisenhower’s massive retaliation strategy in the 1950s. Ridgway feared that the strategy would eliminate the military’s exclusive preserve over the conduct of war and for that reason worked ardently against it. It’s not hard to imagine some future general and flag officers mobilizing against an effort to create more seamless boundaries between the civilian and military sectors. These dynamics will be even worse if not just civilians in the tech sector but also the machines they create are viewed as part of the jurisdictional threat.

The Time Is Now to Consider These Issues

Given the many challenges facing the military as it adapts to rapid technological change, it might be tempting to put off thinking about fundamentals like civil-military relations. That would be a mistake. Analysts and practitioners should begin now to think seriously about what will be required to ensure that necessary organizational adaptation is enabled, not stymied. Of course, caution is warranted when contemplating the integration of new technologies in the military domain. But organizational parochialism or fears for the profession’s future could easily distort that assessment. Anticipating and preparing for such obstacles is essential. Efforts must also be made to understand what civilian oversight will involve in the future. A better understanding of how to navigate new complexities in civil-military advisory processes is also vital. Sage leadership and trust — the essential currencies of civil-military relations — will be needed to manage these challenges.


Risa Brooks is Allis Chalmers Associate Professor of Political Science at Marquette University and an adjunct scholar at the Modern War Institute at West Point.

Image: U.S. Navy/MC3 Josue L. Escobosa
