The Big and Urgent Task of Revitalizing Nuclear Command, Control, and Communications
Perhaps we were naïve, but my coauthor and I did not expect our article in War on the Rocks — “America Needs a ‘Dead Hand’” — to generate such great and mostly critical interest. Giving the article a provocative title was an effort to draw readers into a topic — nuclear command, control, and communications (NC3) — that is often overlooked. And even though we chose the title, the argument we made was far more nuanced than just those few words.
Unfortunately, judging from the critiques that followed in the weeks after publication, we were unsuccessful in our objective of generating real discussion about America’s outdated NC3 system and how best to overcome the challenge presented by new nuclear weapons developments — Russian, in particular. Instead, our article was turned into a straw man and burned at the stake for heresy.
We certainly don’t mind an open debate, but hope to clarify some key points that we worry were lost in the minor controversy that followed.
Nuclear Command, Control, and Communications
The fundamental premise of our article is that technological advancements in conventional and nuclear forces are reducing the time available for senior decision-makers to progress through the nuclear command and control process (detect, decide, and direct) before a peer adversary’s weapons strike American military and civilian targets. We call this “attack time compression.” The shrinking of decision time, as we note, is not new. It is a process that the NC3 system has adapted to over the past 60 years or more as new delivery vehicles that could more rapidly strike American targets were developed.
Our recent experience designing and fielding courses on this subject for the Air Force NC3 Center left us concerned that the existing system, including ongoing modernization efforts, may not provide senior decision-makers with adequate time to evaluate an inbound strike, progress through available options, and direct a response before both the thick line (peacetime NC3 system) and thin line (survivable post-attack NC3 system) fail. This time compression vulnerability is most concerning.
We are not alone in our concern for the survivability of the NC3 system. Retired Lt. Gen. David Deptula, former Assistant Secretary of the Air Force for Acquisition William LaPlante, and Robert Haddick recently wrote one of the few unclassified studies addressing some of the challenges the system faces. They too found that the NC3 system suffered from benign neglect. This is because the command and control systems of the larger NC3 architecture are too easily taken for granted. They are mundane — yet critical — communications systems. Without a large-scale modernization of the NC3 system, they argue, the United States cannot guarantee successful nuclear deterrence. We largely agree with their assessment. However, in our article, we sought to take an unbounded look at the problem and possible solutions, which allowed us to contemplate something as radical as an artificial intelligence-informed NC3 system.
We concluded that replacing existing component systems — within the larger NC3 system — with modernized command, control, and communication systems may not adequately reduce the time required to detect an attack, decide on a response, and direct a response. Even with a “modernized” NC3 system, survivability in the face of hypersonic glide vehicles, cyber operations, electromagnetic pulse, and low-observable conventional or nuclear cruise missiles remains a major concern.
We disagree with those in the arms control community who suggest these new capabilities do not pose a substantial threat to the U.S. nuclear triad, NC3 system, or other supporting forces that make a second-strike response possible. Our primary objective in posing four alternatives to the current approach — an artificial intelligence-based NC3 system being one of them — is to highlight what we see as a growing problem in need of a solution that is different from the current modernization plan.
We understand that our analysis runs counter to the desire of arms control advocates to reduce the size of American nuclear forces and decrease funding for nuclear modernization. Happily, our intent was never to convert arms control advocates, but rather to bring attention to what may be the least well understood part of the U.S. nuclear force among those who are not opposed to modernization.
For many of our critics, conceding to the notion that the NC3 system may not be as robust as many Americans assume is heresy. But we are of the view that this system needs to be completely redesigned. If that makes us heretics, so be it.
Artificial intelligence is a technology that is certain to have a profound influence on the future of warfare. Its integration into a wide array of military systems, including NC3, deserves the kind of investment and attention that America’s rivals are devoting to it. Artificial intelligence, robotics, quantum computing, and a few other technologies are set to play a central role in a “third offset” if the United States can lead in their development and be a first mover in those areas.
Russia and China are challenging the United States in the development of these technologies and their integration into critical defense capabilities. Their development can and will impact strategic stability and the viability of existing nuclear forces.
It is easily conceivable that attack-time compression will reorder the detect-decide-direct process into a decide-detect-direct process. If that occurs, AI will play a role. Soon, the hand-wringing and recriminations of today may be irrelevant if strategic competition with Russia and China, who are unencumbered by Western values, forces the United States to incorporate technologies that current leaders confidently insist will never be adopted. If my own recent experience has taught me anything, it is that fear and self-interest trump morality — with few exceptions.
We offered artificial intelligence as one of four possible solutions to the NC3 challenge because we see it as the central technological development set to shape the future over the coming decades. Developing and fielding militarily advantageous artificial intelligence before Russia or China do the same is critical. Neither of these regimes has moral qualms that will constrain its actions. They are dedicated to changing the global status quo, which distinctly advantages the United States. Thus, they are seeking to take the lead in developing artificial intelligence and other militarily advantageous technologies. Both Vladimir Putin and Xi Jinping understand the wisdom of the great Ricky Bobby, who famously said, “If you’re not first, you’re last!”
Our critics claim we want to turn over nuclear decision-making to artificial intelligence devices such as Amazon Echo. While this may be humorous, it is detrimental to actually addressing the challenge the United States faces. We freely acknowledge the current limitations of artificial intelligence technologies, but we do not agree that these same limitations will necessarily exist one, five, or ten years hence. Skeptics of AI expect it to be perfect 100 percent of the time if it is to replace people. Yet humans are not held to the same standard. They are deeply flawed and prone to mistakes born of stress, groupthink, bias, and other cognitive challenges. We have avoided nuclear miscalculations and mistakes not by ensuring individual perfection but by taking a layered approach to security. The same would be required with any AI.
Artificial intelligence certainly has hurdles to overcome, but across many scientific disciplines, the once impossible is now reality. When the Apollo lunar module landed on the moon on July 20, 1969, less than seven years had passed since President Kennedy challenged the nation to go to the moon. We expect artificial intelligence and related technologies to follow a similar path of rapid advancement. The reality of our time is that we are on the cusp of major innovations that will challenge our current approach to nuclear operations and war more generally.
Our sole concern is achieving the desired end-state — strategic stability and American supremacy, which are not mutually exclusive. We believe that trends in technological development will ensure artificial intelligence plays a central role, but the exact shape of the future is yet to be determined.
I would ask those who are serious about ensuring the survivability of the American nuclear deterrent to join us in thinking about new approaches to guaranteeing that America’s adversaries never, even for a moment, doubt that the United States can command and control its nuclear forces under any set of conditions. Whether that is accomplished through an artificial intelligence-based NC3 system or some other means will be decided over the coming decade — and only after several technologies reach maturation.
Dr. Adam Lowther is professor of political science at the U.S. Army’s School of Advanced Military Studies (SAMS) at Fort Leavenworth. He previously served as the founding director of the School of Advanced Nuclear Deterrence Studies (SANDS) and at the Air Force Research Institute. Lowther served in the U.S. Navy. The views expressed here are not those of the School of Advanced Military Studies, the U.S. Army, the Department of Defense, or any part of the U.S. government.