Special Operations Forces Require Greater Proficiency in Artificial Intelligence
I tried to be an AI whiz,
But tech knowledge I did lack;
So all day long I sat and cried
As I stared at the AI stack.
A one-sentence prompt is all it took for an artificial intelligence model, Generative Pre-trained Transformer 3, to craft a poem about my struggle to understand AI. Recent advances in AI have generated intrigue about its potential impact across a wide range of applications. More notably, the technology is increasingly accessible, which is exciting for non-technical people like me. Since its release in November 2022, ChatGPT — a specialized version of the bot that created the above stanza — has become an internet sensation. Anyone can use AI to summarize, translate, and generate human-like text or write code in various programming languages. Because of this ease, technical knowledge seems less necessary for general users of AI.
This assumption carries over to the military, where operations increasingly depend on AI. Warfighters are seen as general users without technical knowledge, while technical experts are tasked with managing and evaluating the technology. As AI becomes pervasive in military operations, however, the Department of Defense will not be able to rely solely on these experts to understand AI systems. AI is intimidating, but non-technical professionals should still strive for a deeper understanding of the technology. Warfighters, particularly those in special operations, should acquire in-depth knowledge of AI to effectively and responsibly use the military technologies that will increasingly fuse with decision-making processes.
Many have written about the need for general AI education within the military, but this article goes a step further, asserting that AI education is particularly important for special operations forces, who serve as incubators of new ideas, products, and processes. Special operations forces are not only positioned to be at the cutting edge of testing and evaluating AI, but they will also be among the first to deploy these technologies in the operational environment. Therefore, the special operations community should advance greater AI competency at the tactical level.
Beyond One AI Archetype
Now is the time to seriously consider including AI education across the special operations enterprise, as the command reviews its professional military education opportunities under the 2023 National Defense Authorization Act. The importance of “cultivating Special Operations Force technical skills” was already identified in the 2022 National Defense Authorization Act, which states that “certain niche technical skills,” including machine learning and artificial intelligence, “are essential to the conduct of irregular warfare.” The act further calls for the Department of Defense to assess “efforts to grow the training infrastructure for their STEM [and especially AI] workforce.”
To this end, the 2020 Department of Defense AI Education Strategy serves as an important reference to identify appropriate AI education requirements within special operations forces. The AI Education Strategy outlines six archetypes within the Department of Defense workforce to tailor education and training requirements based on their AI-related role: Lead AI, Drive AI, Create AI, Embed AI, Facilitate AI, and Employ AI. For the majority of these archetypes, the roles are relatively straightforward. The Lead AI archetype is composed of senior leaders who determine the policies and doctrine necessary for responsible AI deployment. The Drive AI archetype consists of the acquisition, capability, and product managers who ensure the delivery of appropriate AI capabilities. The Create AI archetype includes AI engineers and data scientists who build the AI tool. Finally, the Employ AI archetype comprises end-users of the technology, representing most of the Department of Defense workforce.
The Embed and Facilitate AI archetypes are a bit more nuanced. These two archetypes lie between the Employ AI archetype (end-users) and the Create AI archetype (AI developers). The Embed AI archetype “deploys, maintains, adapts, and collects data for AI/ML systems at the tactical edge.” They “support use case development” and “solve AI application issues down-range to maintain functionality.” Those within the Facilitate AI archetype represent the end user’s needs and work with AI developers to refine requirements. While the Department of Defense AI Education Strategy lists technicians, product owners, user experience designers, and other technical experts as individuals who would fall within the Embed and Facilitate AI archetypes, these roles do not directly translate into the current military workforce. In other words, it is not always clear who can or should bridge the gap between AI developers and general users.
Nonetheless, what is clear is that special operations forces require skills, education, and knowledge beyond those of a general user within the Employ AI archetype. With greater access to rapid prototyping and testing, these warfighters often interact directly with product developers to iterate upon the design, functionality, and usability of new tools and equipment. They also often operate at the tactical edge in semi- to non-permissive environments and, thus, require the ability to troubleshoot their equipment unassisted. In these situations, the operator will require the skills that pertain to the Embed and Facilitate AI archetypes. They will need to understand data streams, troubleshoot machine learning models, and communicate end users’ needs to technical experts supporting them from the rear.
Further, as the special operations community continues to experiment, test, and acquire a growing number of AI-enabled systems, more individuals will be involved in AI projects. These projects are more fluid and iterative than traditional project lifecycles. As AI permeates more aspects of special operations, operators may be called upon to assist with project planning, including defining the problem and determining the most appropriate AI approaches. Additionally, they may be involved in testing and evaluating an AI system, which will require more in-depth knowledge than a typical user. Therefore, special operations teams should understand AI project management, data management, and performance metrics.
An understanding of AI and its metrics does require a foundation in math and statistics. Concepts such as true/false positives and negatives, confidence intervals, and error metrics should be familiar friends. Special operations users need to be able to assess whether the data being fed into the AI system is relevant, current, and representative of the actual operational environment. When AI developers discuss the performance and maintenance of the system, special operations personnel should be knowledgeable enough to work with the developer to fix unaddressed concerns. This competency will speed the development and improvement of AI systems and tools for the military.
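To make these concepts concrete, the sketch below shows how a handful of standard classification metrics fall out of the true/false positive and negative counts a detection tool might produce. It is a hypothetical illustration with invented numbers, not a depiction of any fielded system.

```python
# Hypothetical illustration: basic performance metrics for a binary
# "flag / don't flag" classifier, computed from confusion-matrix counts.
# All numbers are invented for the example.

def classification_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Return common error metrics from true/false positive and negative counts."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0   # of items flagged, how many were real
    recall = tp / (tp + fn) if (tp + fn) else 0.0      # of real items, how many were flagged
    false_positive_rate = fp / (fp + tn) if (fp + tn) else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return {
        "precision": precision,
        "recall": recall,
        "false_positive_rate": false_positive_rate,
        "f1": f1,
        "accuracy": accuracy,
    }

if __name__ == "__main__":
    # Suppose a tool screened 1,000 intercepted messages: 40 genuine items of
    # interest were flagged (TP), 60 benign messages were flagged by mistake (FP),
    # 10 genuine items were missed (FN), and 890 benign messages were correctly
    # ignored (TN).
    for name, value in classification_metrics(tp=40, fp=60, fn=10, tn=890).items():
        print(f"{name}: {value:.3f}")
```

In this invented example the tool scores 93 percent on accuracy, yet more than half of its alerts are false alarms, exactly the kind of gap an operator fluent in these metrics would catch before trusting the output.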
Of course, making AI performance metrics comprehensible to users is not the sole responsibility of special operations personnel. AI developers have an obligation to make the technology as transparent and explainable as possible. AI systems should be able to provide context or accompanying reasons for their outputs. However, despite the Defense Advanced Research Projects Agency's efforts to put forward new explanation techniques, determining how much to explain to a user remains an intractable challenge. Moreover, how best to display the results of an AI system will likely vary by use case and intended users. Therefore, close collaboration between AI developers and special operations forces is necessary to ensure that AI technologies are being used as efficiently and effectively as possible.
Although soldiers may be able to drive tanks or fire artillery without any mechanical expertise, special operations forces will require sufficient technical knowledge when using AI-enabled systems because the technology offers a fundamentally different kind of capability. Special consideration is required given the significant impact that AI can have on human decision-making and the way wars are carried out. As AI becomes increasingly complex and ubiquitous, users will find that examining only the output or, more simply, whether the tool works is an inadequate evaluation of the AI system.
AI technology cannot be evaluated based simply on whether it seems to work. Ascertaining whether AI is “working” can be very difficult, especially when dealing with massive amounts of data at machine speed. While an AI-enabled tool may be able to generate a visually stimulating briefing product that depicts narrative trends and sentiment analysis, its analysis might be useless or, worse, inaccurate. Underlying data and decision thresholds shape these assessments. Model drift (degradation in performance as the operating environment changes) and adversarial manipulation can further complicate the use of AI. Algorithms can be brittle. Data can get stale. Adversarial attacks may poison data and cause imperceptible but consequential changes to an AI model. These challenges are yet another reason why the effective use and maintenance of AI systems call for an understanding of data and model performance.
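As a rough illustration of what watching for drift can look like, the sketch below compares a simple statistic of data observed in the field against the baseline recorded when a model was trained. The feature, numbers, and alert threshold are all assumptions made for the example, not part of any particular system.

```python
# Minimal drift check (hypothetical data and threshold): measure how far the
# mean of a single input feature observed downrange has shifted from the
# baseline seen at training time, in units of the baseline's standard deviation.

from statistics import mean, stdev

def drift_score(baseline: list[float], observed: list[float]) -> float:
    """Shift of the observed mean from the baseline mean, in baseline standard deviations."""
    baseline_std = stdev(baseline)
    if baseline_std == 0:
        return 0.0
    return abs(mean(observed) - mean(baseline)) / baseline_std

if __name__ == "__main__":
    # Invented numbers: message lengths (in words) the model saw during
    # training versus what the team is collecting in the field this week.
    training_baseline = [12, 15, 14, 13, 16, 15, 14, 12, 13, 15]
    field_observations = [25, 28, 30, 27, 26, 29, 31, 24, 28, 27]

    score = drift_score(training_baseline, field_observations)
    ALERT_THRESHOLD = 3.0  # hypothetical; a real system would set this empirically
    if score > ALERT_THRESHOLD:
        print(f"Possible drift (score {score:.1f}): flag the model for review.")
    else:
        print(f"No strong drift signal (score {score:.1f}).")
```

Real drift monitors rely on more robust statistical tests, but the underlying habit is the same: treat the data feeding a model as something to inspect, not something to assume.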
AI performance metrics, of course, can get rather complicated. Complex AI systems involving multiple models or foundation models can introduce new challenges in evaluating performance. Further research is needed to determine appropriate methods of communicating to users how to operate these more complex systems. No matter how “indistinguishable from magic” the AI may seem, operators should never blindly trust the system’s performance. Even with “simpler” metrics, users may not understand the full context of what the metric is assessing. Nonetheless, an understanding of these performance metrics is critical for operators because they provide some level of standardization and objective measure to determine the value and status of an AI tool. While higher quantitative metrics do not necessarily mean that a tool is better, low performance scores do hint at a diminished value.
Formalizing Required AI Skills Within Special Operations Teams
Admittedly, ensuring that every single operator has the necessary level of AI knowledge to supplement technical expertise will be difficult. At a minimum, the military should ensure that each special operations team has at least one individual with more advanced AI knowledge who can fulfill the tasks of the Facilitate AI archetype and take on some of the responsibilities of the Embed AI archetype. These operators do not need to be full-scale programmers. Rather, they should be “maintainers” of the technology to ensure the continued functionality of the AI system downrange. This skill set will become increasingly necessary for special operations forces as they use more advanced AI-enabled systems requiring human-machine teaming in deployed environments.
These AI-focused team members should be able to identify key areas of concern when using and evaluating emerging AI tools and provide informed feedback to AI developers to facilitate continual improvement. Individuals with more advanced AI knowledge can act as a bridge between technical experts, often embedded in headquarters, and other members of their special operations team at the edge. Even as the military pushes for greater integration of data scientists at lower levels to provide direct assistance, there will be a steep learning curve for all parties as they try to understand their respective roles. Having someone in the operational team who understands the language of data, algorithms, and models will be critical to working with data scientists.
The inevitable challenge is finding the time, personnel, and resources to educate and train warfighters toward this higher level of AI competency. In addition to a lengthy qualification pipeline, special operations forces have to complete a laundry list of training mandated by the Department of Defense, their service component, and their occupational specialty. Adding another training requirement may lead to frustration within the community. Nonetheless, leaders should view AI training and education as a priority because AI will be an intrinsic part of warfare. Advantage on the battlefield is not generated solely by technology but also by the ability of warfighters to use this technology. If special operations forces want to fully employ AI applications in the military and drive their evolution, a deeper understanding of the technology is needed.
While there is a push for increased AI training and education across the Department of Defense, this initiative has yet to be implemented fully at the tactical level. Currently, individuals with an interest in AI can take the initiative to educate themselves on the technology, but broader access to AI educational opportunities remains limited. Rather than relying on ad hoc initiatives, special operations forces should create an additional skill identifier or certification requirement for those selected to work more intimately with the technology. Formalizing and expanding technical understanding at the unit-of-action level is critical to maintaining an advantage in future warfare.
As the Department of Defense AI Education Strategy outlines, AI training and education can be a combination of asynchronous and synchronous learning. Therefore, building advanced AI skills does not necessarily mean lengthening the special operations qualification pipeline. Instead, select personnel would undergo self-paced AI courses supplemented with boot camps and workshops. The key is to define a position at the team level that requires advanced AI skills, which would then allow selected individuals to pursue AI education, rather than requiring the competency across the board.
Some may argue that rather than establishing AI knowledge as a core competency within the special operations forces, AI skills should come from “enablers” who directly support these teams. This argument is valid, and special operations should continue to attract technically savvy personnel into its organization. Reliance on enablers, however, does not address two enduring issues. First, special operations teams do not always deploy to the theater of operations with their enablers and may find themselves separated from their AI expertise. While AI experts could potentially troubleshoot issues from afar, access and connectivity to forward-deployed teams are not guaranteed. In these situations, a member of the special operations team should be able to assess and mitigate any issues with the AI systems they are using.
The second issue is that demand for AI talent is widespread across both the Department of Defense and other industries. U.S. Army Special Operations is exploring changes in force structure and a new warrant officer career field to add technology-oriented roles and enable tactical-level “tech integration,” but there are simply not enough technical experts to fill these enabler roles for all tactical teams. Furthermore, institutional constraints such as limitations on personnel can hinder efforts to hire and train AI enablers. The integration of advanced AI technologies into military operations will likely occur faster than special operations forces' ability to grow a new cadre of AI experts. Even as AI becomes part of nearly all defense applications, talent shortages mean the military will likely retain AI expertise at higher levels rather than at the company, detachment, or team level for the foreseeable future. Given this plausible scenario, special operations forces should invest in training and educating their existing workforce to address this gap in AI knowledge.
The military has access to more talent than it may realize. Although it would be ideal for individuals with formal science, technology, engineering, or math backgrounds to fill these more advanced AI roles, the lack of a technical degree should not be disqualifying. With the right level of guidance, training, and exposure to technology, individuals without a technical degree can gain enough knowledge to identify and troubleshoot issues with a particular AI application. A trained operator should supplement, not supplant, a technical expert, particularly when technical expertise is not readily available. Hiring only those with technical backgrounds to fulfill these roles would limit special operations forces' ability to generate AI skills at the team level, which is a growing necessity as AI becomes an indispensable part of how teams plan and conduct operations.
Conclusion
The excitement over AI is palpable. From the “hyper-enabled operator” working at the tactical edge to the analyst assessing social dynamics and trends within the operational environment, AI is touted as a transformative technology that will impact every facet of warfare. Yet, broad adoption of the technology is still in its embryonic stages. If the special operations community wants to be well positioned for an AI-driven future, it should recognize the need to build a cohort of operators who can identify and diagnose the more complex issues posed by AI.
While this article focuses on the special operations community, this recommendation should not be considered a special operations-exclusive endeavor. Other units should also explore ways to generate more advanced AI proficiency, down to the lowest level possible, to increase the resiliency of the force as AI becomes an integral part of a warfighter’s kit. Attracting technical talent to the Department of Defense remains an important priority, but the reality in the near term is that the military is unlikely to fill tactical-level teams with these experts. Hence, the military should look to its existing talent pool to augment this expertise. Ultimately, understanding one’s own tools and equipment is the responsibility of the individual warfighter.
Kelley Jhong is a U.S. Army Psychological Operations officer with operational experience in the Indo-Pacific region. A recent graduate of the Naval Postgraduate School, she researched how special operations forces should evaluate AI for operations in the information environment.
The views expressed are those of the author and do not reflect the official position of the Naval Postgraduate School, the U.S. Special Operations Command, the U.S. Army, the U.S. Department of Defense, or any other entity within the U.S. government.
Image: Alexander Gago