Can Warfighters Remain the Masters of AI?


Editor’s Note: This article was submitted in response to the call for ideas issued by the co-chairs of the National Security Commission on Artificial Intelligence, Eric Schmidt and Robert Work. It addresses the second question (part b.) on the types of AI expertise and skill sets the national security workforce needs.

 

The Department of Defense is engaging in a dangerous experiment: It is investing heavily in artificial intelligence (AI) equipment and technologies while underinvesting in the preparation its workforce will need to understand and implement them. In a previous article, Michael Horowitz and Lauren Kahn called this an “AI literacy gap.” America’s AI workforce, in uniform or out, is not prepared to use this fast-advancing area of technology. Are we educating the masters of artificial intelligence, or are we training its servants?

 

 

The U.S. government as a whole, and the military services in particular, are flush with “AI mania.” One could be forgiven for thinking that dominance in AI is today’s preeminent military competition. It is certainly true that advances in technology — including AI — will be drivers of national success in the future. However, we are currently in the “boost phase” of excitement about AI. From our perspective as practitioners and educators in the field of statistics applied to military problems, it is almost certain that mid-term expectations for AI will not be fully met. This interplay between inflated expectations, technical realities, and eventually productive systems plays out in the business world as well, where it is described by the Gartner “Hype Cycle.”

Figure 1: Gartner Hype Cycle. Notably, AI technologies are on the “upswing” of hype. How will the Department of Defense position itself, with a particular eye toward manpower, to survive the inevitable “crash”?

The current hype about AI is intense and is likely to be followed by a crash. This is a statement less about the technology, or even the U.S. government, than about human nature. The more promising the technology, the harder the eventual crash before it enters the productive phase.

As an example of the disconnect between technologies and manpower, the Defense Department recently added “data scientist” to its job series descriptions, although a universally accepted definition of what a data scientist is remains elusive. Our working definition of a data scientist is someone who sits at the intersection of the disciplines of statistics and computer science. On the back side of the curve, the glacial pace of Defense Department budgetary programming means that current AI initiatives will be around for the long haul, which means there will need to be a cadre of individuals with the requisite education to see us through the hype cycle’s “trough of disillusionment.”

At the same time, the Navy in particular is shedding its AI-competent manpower at an alarming rate. By AI-competent manpower we mean operationally experienced officers with the requisite statistical, computer programming, and data skills to translate advanced computing research into combat-relevant, data-driven decisions.

We have observed several trends that support this assertion. The number of Navy officers directly involved in operations and eligible for command at sea (unrestricted line officers) who take the Naval Postgraduate School’s operations analysis curriculum — mathematics applied to military problems, focusing on statistics, optimization, and associated disciplines — has decreased dramatically in the past 10 years. For example, the last U.S. naval aviator to graduate with the operations analysis subspecialty did so in 2014. The Navy’s assessment division (OPNAV N81) — the sponsor for the operations analysis community — has recognized this trend and directed the creation of a tailored 18-month program for unrestricted line officers, with the objective of gaining more analytical talent in the fleet. Other Navy communities, such as information warfare, are only now recognizing the need for officers educated in the programming, statistical, and analytical skills needed to fully develop AI for naval warfare, and are beginning to send one or two officers annually to earn operations research degrees. We are personally aware of at least two cases in which flag officers became directly involved in the detailing of officers with an operations research or systems analysis specialty. What is striking about these cases is that the officers in question are considered “unpromotable” by their unrestricted line communities of origin: they spent career time on in-depth education and are frequently penalized for it.

We write in our roles as professionals, retired naval officers, and frequent commentators on defense policy. It is our firm opinion that the Navy’s future with artificial intelligence rests critically on the natural intelligence that enables and guides it.

First, It’s About Perspective

The true challenges to AI lie in policy, not technology. What is the impact of AI, and what is the right historical parallel? Many organizations both in and out of government reason that AI is a “big computery thing,” so it should go with all of the other big computery things, which frequently means it gets categorized as subordinate to the IT department. Although IT infrastructure is a necessary component of artificial intelligence, we think this categorization is a mistake. It is clear to us that in the coming era, the difference between “warrior” and “computer guy” may become blurred to the point of non-existence. An excellent historical example is that of Capt. Joe Rochefort, who at the time was considered — derisively — what we might now call a “computer geek” but who, in retrospect, was one of the architects of victory at Midway — and, by extension, in the Pacific theater.

We think that a useful historical parallel to the broad introduction of AI into the service is the introduction of nuclear power to the Navy some 65 years ago. It would have been an unthinkable folly to stop the education of nuclear-qualified engineers while introducing the USS Nautilus and USS Enterprise to the fleet. Yet this is, in so many words, the Navy’s current strategy toward technical education for its officers at the dawn of naval AI.

Similarly, while there are many offices working on AI in the Navy, there most likely needs to be a single strong personality — like Hyman Rickover for the nuclear Navy and Wayne E. Meyer for Aegis — who will unify these efforts. This individual will need to understand the statistical science behind machine learning, have a realistic appreciation for its potential operational applications, and have the authority to cultivate the manpower necessary to develop and use those applications.

Next, It’s Manpower

As other writers in these pages have noted, it may seem paradoxical that the most important component of building better “thinking machines” is better-thinking humans. However, the writing is on the wall for both industry and government: The irreplaceable element of success is having the right people in critical jobs. The competition for professionals in statistics and AI is tight and expected to become tighter. Simply put, the military will not be able to compete for existing talent on the open market. Nor can the open market provide people who understand the naval applications of the science. As with nuclear power, for the Navy to successfully harness AI, it needs sailors who are educated to become its masters — not trained as its servants.

There is a shortage of people working in applied mathematics generally and AI specifically, and nobody truly knows how the systems being developed now will behave when eventually deployed and tested in situ — on board actual ships conducting actual operations. The ultimate judge of the Navy’s success in AI will be the crews that man the ships on which it is installed, and it is our observation that the technical depth of these crews is decreasing over time. This is why it is critical that the services — particularly the Navy — “grow their own” and build a cadre of professionals with the requisite education and experience (and who happen to be deployable). Aviators, information warfare officers, submariners, and surface officers should be inspired to obtain the technical, professional, and tactical analytical skills to best apply future AI at sea. One cannot help recalling a historical analogy — learning how best to apply “new” radar technology during the nighttime Battle of Cape Esperance in October 1942. In that battle, Adm. Norman Scott and his battle staff were not aboard the ship with the best radar capabilities, which resulted in confusion about enemy locations and friendly identification. Better knowledge of this new technology might have resulted in its more effective employment.

What will inspire officers to gain the skills to serve as masters of AI and then resist seduction by the private sector, remaining in the service instead? Organizational recognition of their value, through promotion within their operational fields and opportunities to perform at a higher level much faster than they would find in the outside market. This is, sadly, not current naval personnel practice. Paradoxically, time spent away from warfare communities to gain advanced education in areas such as those needed to master AI is currently seen as “dead time” at best and a “career killer” at worst. In the near future, the use of advanced algorithms to guide warfare and operational decisions will no longer be a subspecialty but rather an integral part of the warfighting mission of the Navy. Accordingly, moving away from the educational “quota” system derived from subspecialty requirements is a solid first step. In its place should be a Navy educational board that selects due-course officers for specific educational programs that will shape the Navy’s future rather than merely meet the requirements of current staffs.

When the Navy introduced nuclear engineering, it established a nuclear engineering school to meet its manpower requirements. When the Navy introduced the Aegis combat system, it established a dedicated Aegis school to meet its manpower requirements. The difference between these historical examples and AI is that AI does not need the same physical safeguards as radioactive materials and high-power radars. The Navy already has the ability to better prepare its AI workforce through multiple institutions and methods — both military and civilian — including the Naval Postgraduate School, civilian universities, and fellowships. Programs at these institutions provide the programming, mathematics, and computer science skills needed to gain a deep appreciation for AI technology. Better incentivizing and using the tools already in place will allow sailors to apply AI science to warfighting advantage. Where possible, the Navy should partner with industry and outside academic institutions to augment military experience with the lessons being learned commercially, resulting in a technical education with an operational purpose.

AI technology is maturing, and the educational programs already exist. The critical element is the sailors who will be its masters, integrating and deploying it in the fleet. These challenges can be solved internally by policy — not externally with technology. It will ultimately be those policies that determine the success of the fleet.

 

 

Harrison Schramm is a non-resident senior fellow at the Center for Strategic and Budgetary Assessments. While on active duty, he was a helicopter pilot and operations research analyst. He enjoys professional accreditation from INFORMS, the American Statistical Association and the Royal (U.K.) Statistical Society. He is the 2018 recipient of the Clayton Thomas Prize for contributions to the profession of operations research and the 2014 Richard H. Barchi Prize. As a pilot, Schramm was awarded the Naval Helicopter Association’s Aircrew of the Year (2004).

Jeff Kline is a professor of practice in the Naval Postgraduate School Department of Operations Research. Kline supports applied analytical research in maritime operations and security, tactical analysis, risk assessment, and future force composition studies. He has served on the U.S. chief of naval operations’ Fleet Design Advisory Board and several naval study board committees of the National Academies. His faculty awards include the Superior Civilian Service Medal, the 2019 J. Steinhardt Award for Lifetime Achievement in Military Operations Research, the 2011 Institute for Operations Research and the Management Sciences (INFORMS) Award for Teaching of OR Practice, the 2007 Hamming Award for interdisciplinary research, and the 2007 Wayne E. Meyer Award for Excellence in Systems Engineering Research.

Image: Naval Postgraduate School (Photo by Javier Chagoya)