The Future of Algorithmic Warfare: Fragmented Development


Editor’s Note: What follows is an excerpt from the authors’ forthcoming book, Information in War: Military Innovation, Battle Networks, and the Future of Artificial Intelligence.

The current promise that artificial intelligence/machine learning (AI/ML) capabilities will revolutionize warfare stands in contrast to the historical record of high hopes and stalled progress of algorithmic warfare in the profession of arms. Despite the novelty of new chat bots and battlefield proliferation of AI/ML in Ukraine, the extent to which large, legacy military organizations — read the U.S. armed services — will prove adept at cultivating change is still uncertain. 

To that end, we wrote a book — Information in War: Military Innovation, Battle Networks, and the Future of Artificial Intelligence — about how military organizations adapt new information technologies, including modern battle networks and AI/ML. We did so because, as scholars and military professionals, we have witnessed stalled progress up close. In the book, we offer four alternative futures about the prospects for AI/ML in the U.S. military. Each is written as a slice-of-time scenario following the chairman of the Joint Chiefs of Staff on their morning commute. The logic of each scenario maps the findings from historical case studies in the book exploring how and why the military profession has struggled to integrate new information technologies to build battle networks over the last one hundred years.

In this first scenario we show that bold innovators can find their path blocked by bad bureaucracy. Even when ideas about change are in the air — as they are today with the excitement around AI/ML, including generative approaches and large language models — progress can still be stalled by antiquated organizations and old routines. No amount of technology can substitute for an unwillingness to experiment and adapt legacy force design to take advantage of new information technology, including AI/ML. New gadgets die in the iron cage of outdated bureaucracy.

* * *

The year is 2040. The chairman of the Joint Chiefs of Staff rides to the Pentagon in her unmanned car, browsing a roundup of defense news, while listening to a historical podcast on the politics of the Army in the 1990s. As the narrator talks about Gen. Gordon Sullivan and efforts to digitize the Cold War Army, three headlines catch her eye and pause her news scrolling:

“Air Force Colonel Questioned over Swarm Experiment Challenging the Current Design of Fighter Squadrons.”

“New Report: Chinese AI-Driven Simulators Rely on Scraping Social Media to Train Models on U.S. Military Doctrine, Decision-Making.”

“Army Surgeon General Joins American Medical Association Lawsuit Against Hippocrates Application and Other Digital Physician Services.”

She asks her digital assistant to summarize a recent Congressional Research Service report on the status of AI/ML integration across the U.S. military. The bottom line hadn’t changed: Legacy organizational structures — bureaucracies larger than they were nimble — limited the ability of the U.S. military to truly harness the power of AI/ML to revolutionize warfare. There were innovators who both shaped and used policy directives to run local experiments. New accounting lines were even being created to fund it all. But, just like the Air Force colonel, innovators were often held back by arbitrary rules and a cadre of institutionalized souls. This older generation of retired colonels turned federal employees seemed more interested in preserving the status quo than changing it. Why question a bureaucracy and career paths that made you successful?

The services were willing to experiment, or at least put on grand plays about it — innovation theater at its finest. Hell, there was money on the table! But the deeply rooted structures — the staffs, the corps, the divisions, the battalions and squadrons, the very nature of the rank structure — remained little changed since Napoleon. The military was an antique iron cage.

More traditional senior leaders, who tended to populate service leadership, claimed there was an enduring essence to combat that made corps, air wings, and other large staff structures vital to coordinating effects in warfare. They saw the benefits of increased precision and lethality but preserved a view of judgment as hierarchical and human, often in a pre-Copernican manner revolving around themselves — “the Commander” — as opposed to data aggregated and analyzed across a larger battle network. Machines could help to filter information and even propose courses of action, but the old men on horseback still preferred “green tab to green tab,” commander-to-commander interactions, delegating to staff absent the watchful eye of AI applications. It was an almost romantic view of a tragic enterprise: killing in the name of politics.

The general reflected on her own career. She remembered being detailed to the Joint AI Center as a young captain during the 2020 pandemic, working alongside vendors and scientists to try to predict how the virus would spread and what likely logistical backlogs would emerge. She remembered turning down lucrative offers to leave the U.S. Army to work for a firm that trained visual classification programs that used social media posts to facilitate romantic matchmaking. She loved the coding but found most of the commercial applications trivial.

She was at the front end of a generational shift. Her peers were a mix of digital natives who either understood basic computing principles or simply lived, unaware, in an endless stream of the interactions those principles facilitated. This produced a mix of skepticism and trust based entirely on an individual officer’s knowledge. Those who understood basic coding, statistics, and computing concepts grasped the promise and peril of AI/ML, even its limits in solving military problems. Those who lived online simply tended to trust technology, from their data to their choices, and assumed that someone or something else would make it all work.

She was younger than the generals who still controlled the major combatant commands and upper echelons of the services. Fast-tracked ahead of her peers on four occasions, she was also the first non-combat arms chairman of the Joint Chiefs of Staff. In hindsight, her career was a series of occupational dead ends that always seemed to open new doors. After the pandemic she was pulled into the Chief Digital and Artificial Intelligence Office and the Global Information Dominance Experiments to develop algorithms supporting dynamic targeting. Her work propelled ideas about scalable battle networks and data at the edge. She was an intelligence officer who ended up spending more time building programming infrastructure and leading coding teams than advising old school commanders on what the enemy was going to do. She spent more time fighting against entrenched notions about intuition, and trying to explain Big Data and things like Bayes’ theorem to fellow officers, than she did on building old-fashioned military combined obstacle overlays and collection plans little changed since Desert Storm.

The general was jolted from her memories when the unmanned car hit the curb, swerving to avoid a homeless man wandering onto Army-Navy Drive in Arlington, Virginia. She remembered the shock of most Americans when they discovered that no amount of automation would end all traffic fatalities, much less theft or murder. Over the last twenty years, amazing strides in AI applications altered daily life and even how the military fought, but they didn’t remove the greed, grievances, or emotions of people. They didn’t stop bureaucracies and legacy institutions from resisting change.

It did change the art of war. Despite the survival of large, legacy commands, war was now often conducted by small groups in the shadows supported by swarms of unmanned systems and networks of analysts around the world combing through data and debating insights from models. Data became the new key terrain. Those that had data and could replicate patterns, even possible decisions, gained a position of advantage. Those that didn’t usually reacted based on outdated priors. Those that had data and nimble force structures could simultaneously swarm targets and collapse enemy systems. Those that had large, legacy footprints maneuvered more like the hoplites and tank armies of old, set piece and deliberate. Modern war was less climactic and more a series of violent, punctuated equilibria, as one side thought it had a data advantage and could act. Only AI/ML applications could filter the massive amounts of information, but the military often strained to act, slowed by legacy force structure and staff processes.

As the unmanned car pulled into the Pentagon parking lot, staff greeted the general and escorted her to her office while her digital assistant began her classified brief. After a biometric scan at the building entrance, her feed — a neural implant — began relaying the latest classified updates, prioritized based on an analysis of her prior reading and trending material across the national security enterprise. The system activated only once she was in the building and biometrically matched, to ensure operational security.

She experienced the latest battlefield update from ongoing counterterror operations in Africa. The 101st Airborne Division headquarters was forward, commanding and controlling a train-and-assist mission and providing support to a task force tracking down a fringe group believed to be backed by the Russian GRU and the latest in a series of shadowy private military contractors. She remembered using AI/ML to scrape Telegram posts after the Wagner Group’s failed mutiny in 2023. Looking at the new mission, the chairman wondered why it needed a division headquarters at all. She had done the same mission with just a few CIA operatives and special operators in the Middle East over twenty years ago, crunching data and coding on the fly. Sure, her kit was nothing like what they had access to today, but the effect was the same: scalable battle networks.

As she entered her office, a new feed emerged in her neural link. It was a message from the commandant of the Marine Corps. It was another swan song about the importance of large, manned infantry formations to preserve flexibility in future war. The chairman remembered how the optimism around force design in the U.S. Marine Corps was overcome by a network of retirees and zealots who saw themselves as Spartans. She knew a warning shot when she saw one. The general was really telling her that he had access to an almost cult-like constituency ready to overwhelm congressional offices with bot-assisted calls for her resignation if she pressed AI-force structure shifts that would reduce the size of large infantry organizations any further. 

She thought back again to Sullivan and his efforts to digitize the post-Cold War Army. She had to balance the speed of change against the counter-mobilization of a network of officials and vested interests. The fact that her generation — now starting to take senior commands — only partially understood technology made this task all the harder. Digital natives tended to compromise where information technology and data were concerned. They assumed technology always worked, even if there were hiccups along the way, and could be added to any institution or process to make it faster, stronger… whatever the quality du jour might be. They tended to push for keeping things like divisions and large staffs in place — hell, that’s how they got promoted! — and just add new algorithms to old processes to achieve new efficiencies. Sullivan’s fate, she thought — that of stalled change — was more likely than not her own.

* * *

As the scenario implies, there are reasons to be concerned that the current wave of enthusiasm around AI/ML will collide with legacy bureaucratic structures. The result: Old ways of waging new wars could produce diminishing marginal returns in a defense bureaucracy already strained by rising personnel costs, inflation, and challenges in passing even a basic audit. Successful states don’t waste blood and treasure. 

Military innovation does not emerge from a vacuum. It requires a diverse network of people, career pathways, and institutional antecedents. Technology can catalyze change, but it cannot create change absent open institutional pathways for experimentation. As the scenario implies, there is a future on the horizon in which no amount of private sector AI/ML innovation can overcome the inability of the military to rethink its structure to take advantage of new ways of war. Without structural change, new technology at best produces diminishing marginal returns. As in the cautionary tale of tanks in the interwar period, new equipment gets added to infantry platoons as opposed to creating new armored formations.

The historical precedent to the future scenario above is the U.S. Army of the 1990s. Visionary leaders like Sullivan imagined new concepts and capabilities for an information-age military. Sullivan pushed to change the bureaucracy and accelerate that change but struggled to overcome deep-seated interests and balance the demands of generating rapidly deployable forces for contingencies in Africa, the Middle East, and the Balkans and designing new unit types. His vision of war in the information age, while prescient, failed to change the army at the speed and scale he had hoped for. Change did come, but it took another generation.

It is too soon to say whether the latest efforts to usher in a new era of algorithmic warfare will face a similar fate. There are reasons to be optimistic. The Chief Digital and Artificial Intelligence Office is creating a diverse network of military professionals, civilian officials, and private sector firms ready to change warfare. Service-level initiatives like Scarlet Dragon are similarly pushing ahead. At the same time, the fate of Combined Joint All-Domain Command and Control could cause the whole wave of experimentation to fizzle out, with different services backing different solutions, creating a bureaucratic coordination challenge and limiting the likelihood of a real military revolution.

The true test of whether the latest push for innovation will succeed is whether military organizations start to build new command and control constructs and entirely new formations. If the readers of War on the Rocks start to see articles along these lines turned into field experiments and entirely new units, the possibility of change starts to emerge. If instead they find only glossy think tank reports and LinkedIn posts about new code injected into old bureaucracies, they should not lose hope. Change will come. It will just be slow, uneven, and prone to more half-steps and missteps than real success.

Benjamin Jensen, Ph.D., is a professor of strategic studies at the School of Advanced Warfighting in the Marine Corps University and a senior fellow for future war, gaming, and strategy at the Center for Strategic and International Studies. He is also an officer in the U.S. Army Reserve.

Christopher Whyte, Ph.D., is an assistant professor of homeland security and emergency preparedness at Virginia Commonwealth University.

Colonel Scott Cuomo, Ph.D., currently serves as a senior U.S. Marine Corps advisor within the Office of the Undersecretary of Defense for Policy. He helped co-author these essays while participating in the Commandant of the Marine Corps Strategist Program and also serving as the service’s representative on the National Security Commission on Artificial Intelligence.

Image: AI Generated by Dr. Ben Jensen
