JOINT FORCE QUARTERLY 110

3rd Quarter, July 2023



An AI-Ready Military Workforce

By Iain Cruickshank


Dr. Iain Cruickshank is a Senior Researcher at the United States Military Academy at West Point.
Soldiers check Nett Warrior end-user devices during Army Expeditionary Warrior Experiment force-on-force field demonstration held on Fort Moore (formerly Fort Benning), Georgia, March 4, 2021 (U.S. Army/Jason Amadi)

Much recent professional military writing, such as the National Security Commission on Artificial Intelligence’s Final Report, stresses the need for an artificial intelligence (AI)-ready workforce.1 AI has the distinct potential to create a battlefield advantage for whichever warring party can best harness the technology, making an AI-ready military workforce imperative to gaining that advantage.2 Yet while it is generally clear that the military needs an AI-ready workforce, what such a workforce should actually look like is far less clear.

Most commentators in this area vaguely suggest “AI experts in uniform” as the solution to an AI-ready workforce for the military.3 Recent work has indicated that there are distinct roles in the production of AI as well as distinctive training needs for those different roles.4 Additionally, commentators have pointed out the need for some level of understanding of AI among senior leaders, acquisitions personnel, and users of AI-enabled systems.5 Despite this recent scholarship identifying different relationships to AI within the workforce, there is no unifying model of an AI-ready workforce that accounts for needs such as the relative scale of its different parts. AI workforce proposals to date consider only the creation of AI-enabled systems (for instance, running an AI project or building a model from scratch) or the execution of a full data science project. Moreover, they ignore more realistic uses of AI in military settings, which include tasks such as maintaining models and adjusting them to changes in the operational environment.

In this article, I argue that an AI-ready workforce for the military should be built around an AI skills-in-depth model that:

  • creates gradations of AI technical skills that address the actual demands AI-enabled technologies will place on a military force
  • focuses on educating leadership and the acquisitions community on recognizing opportunities to use AI and evaluating AI capabilities
  • prioritizes the creation of lower-skilled AI technicians in uniform over higher-skilled AI experts in uniform.

Before exploring the proposed model for what an AI-ready workforce looks like for a military Service, it is important to clarify a few points about the use of AI. First, AI-enabled systems require maintenance. Machine learning algorithms, which are at the heart of an AI-enabled system, suffer from many issues, including model drift, changes in the data-generation environment, problems that appear only when models are deployed in real life, and the release of newer, better models.6 These inherent issues mean that AI-enabled systems will require periodic maintenance, updating, and monitoring for changes in model performance or data input to continue to be useful. Second, the application of AI requires careful consideration of the problem. AI is not a catch-all that can solve any problem. AI-enabled systems typically need to be tailored to a specific problem, which requires thought about which problems are amenable to AI solutions and how to implement those solutions in a way that works for the organization.7 Third, AI will often come as part of a larger integrated system. The machine learning that powers an AI-enabled system is typically one relatively small component of a much larger system, like the autonomous threat recognition algorithm in a mobile autonomous platform.8 When using an AI-enabled system for a real-world problem, it is important to remember that the system will require maintenance and that its machine learning models will be only narrowly applicable to a given problem.
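
To make the maintenance point concrete, the following is a minimal, hypothetical sketch (in Python, using the scikit-learn library) of the kind of periodic check a maintainer might run to detect performance drift in a deployed model. The baseline accuracy, tolerance, and model interface are illustrative assumptions, not features of any fielded system.

    from sklearn.metrics import accuracy_score

    # Illustrative values only: the baseline would come from the model's original
    # acceptance testing, and the tolerance from the owning unit's requirements.
    BASELINE_ACCURACY = 0.92
    DRIFT_TOLERANCE = 0.05

    def check_for_performance_drift(model, new_inputs, new_labels):
        """Flag a deployed model for maintenance if its accuracy on newly
        collected, labeled field data has degraded beyond the tolerance."""
        predictions = model.predict(new_inputs)
        current_accuracy = accuracy_score(new_labels, predictions)
        needs_maintenance = (BASELINE_ACCURACY - current_accuracy) > DRIFT_TOLERANCE
        return current_accuracy, needs_maintenance

A check like this fixes nothing by itself; it simply tells the force when a model no longer matches its operating environment and maintenance is due.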

From these fundamental observations, we can deduce the rough outlines of what AI will look like in the military, even if particular details are missing. AI will be present in many, if not most, battlefield systems, from vehicles to mission command suites, and built in as a core component of those systems by defense contractors. All of these AI-enabled systems and their associated machine learning models will require maintenance, at least some of which will need to be conducted by uniformed personnel. There will also likely be a need for ad hoc data science and AI solutions created within military units to support a particular commander or battlefield problem. Thus, interactions with AI-enabled systems will occur predominantly at the user level, with far fewer maintenance interactions and very few design-and-implement interactions.

Outline of an AI-Enabled Military Workforce

Given the real-world demands of using AI in the military, the best way to create an AI-ready workforce is to follow an AI skills-in-depth model of training and education. This model must economize resources while also producing a military workforce that can actually harness the battlefield advantages offered by AI. While no single part of the model is sufficient to create an AI-enabled workforce, each part addresses a necessary component, and combined they are sufficient to achieve the desired endstate. The model’s fundamental dynamic is that the number of workforce members in a given work role decreases exponentially as the AI technical skill required for that role increases. This decrease reflects two primary considerations. First, as the level of expertise in AI technical skills increases, the “cost” to create proficiency with those skills increases exponentially. Second, relatively few Servicemember interactions with AI-enabled systems will require specialist AI technical skills, so the model concentrates those skills in a small number of roles. Figure 1 summarizes the model and its different components. Each of the components (that is, users, AI technicians, and so forth) is described in detail in the table.

Figure 1. AI Skills-in-Depth Model

 

Table. Summary of the Different Layers of the Expertise-in-Depth Model for an AI-Ready Military Workforce

Another way of thinking about the AI skills-in-depth model is by the relative amount of time members of each component spend on hands-on work using AI skills. For example, at the user level, hands-on AI technical work will largely consist of recognizing when the AI-enabled system is not working properly. This means that little of a user’s working time will be spent on hands-on AI technical work, whereas an AI technician or functionary, who will have to perform hands-on AI technical tasks (such as fine-tuning models, checking a model’s performance against new data, and checking data integrity), will need significantly more time for those tasks (perhaps equating to a second job or additional duty). Figure 2 displays the dynamic of the amount of working time needed to perform AI technical skills as part of the job.

Figure 2. Relative Amount of Time Spent Performing AI-Technical Tasks
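
As a concrete illustration of one technician-level task named above, checking data integrity, the following is a minimal, hypothetical Python sketch using the pandas library. The expected columns and checks are invented for illustration and would differ for any real data feed.

    import pandas as pd

    # Hypothetical schema for an incoming sensor data feed; real feeds would differ.
    EXPECTED_COLUMNS = {"sensor_id", "timestamp", "lat", "lon", "label"}

    def check_data_integrity(df: pd.DataFrame) -> list:
        """Return a list of integrity problems found in a batch of new data
        before it is used to evaluate or retrain a model."""
        problems = []
        missing = EXPECTED_COLUMNS - set(df.columns)
        if missing:
            problems.append("missing columns: " + ", ".join(sorted(missing)))
        present = list(EXPECTED_COLUMNS & set(df.columns))
        if present and df[present].isna().any().any():
            problems.append("null values in required fields")
        if "timestamp" in df.columns and not df["timestamp"].is_monotonic_increasing:
            problems.append("timestamps out of order")
        return problems

Routine checks of this kind would occupy only part of a technician’s duty day, consistent with the additional-duty workload described above.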

This model closely resembles what is already in place in various military communities. One example is the military medical community: the U.S. Army trains all of its personnel on emergency medical procedures. This type of training is roughly analogous to what is needed within the AI users’ component. On the battlefield, the Army has medics at the unit level providing limited emergency medical care (tactical casualty care). The next level is the aid station, possibly staffed with a physician assistant and registered nurse, both of whom possess greater medical expertise and require more medical education and training. They are capable of providing the next level of medical care and stabilizing the patient. These individuals and their respective levels of skill are roughly analogous to the AI technicians and functionaries when it comes to working on an AI-enabled system. Eventually, the casualty may be transported to a full trauma center to receive lifesaving surgery performed by surgeons, who require more medical education and training than the previous layers. These individuals are roughly analogous to the AI experts component. In short, a layered approach to functional expertise already exists in some military functions, like military medicine.

More concretely, the model consists of five different components—users, leaders and acquisitions experts, technicians, functionaries, and experts—of AI training and education that differ in their hands-on AI technical skills and scope of interaction with military AI-enabled systems. These components, when combined, allow for a robust and realizable AI-ready workforce that can meet all the demands that incorporating AI into warfighting will place on it. The table summarizes the different components of the AI skills-in-depth model.

Given the predicted profusion of AI-enabled systems and equipment on the battlefield, it is likely that most military members will have to interact with AI-enabled technology, and most interactions with AI-enabled technologies will occur at the user level.9 Thus, it is necessary to train the workforce on how to properly use their AI-enabled technologies so that users trust their equipment and can employ it effectively and ethically. To achieve these effects, this training should include some instruction in the high-level concepts of the technology powering the system, like machine learning. Training will also need to include the skills to detect when the technology is not functioning properly. However, how AI-enabled technologies malfunction will be, to a great degree, application-specific (that is, Google Maps malfunctions for different reasons than a detection model in a digital camera). Something like new equipment training, which is part of the standard fielding process for the Army, would be a good place to incorporate this type of user-level training.10 Forces outside the United States have similarly recommended and outlined training for users of AI-enabled systems.11 Generally, the proposed training at this layer requires only basic knowledge of AI. Users practice within their respective fields; the practice of that field could be improved by using AI-enabled technologies but does not require any hands-on technical work in AI.

The next component in the model consists of the military leaders and the acquisitions experts of the workforce. This education is meant to give leaders a big-picture understanding of how AI functions and of some of its technological applications so they can best identify problems that are amenable to AI solutions. To successfully utilize AI-enabled technologies in military operations, just like any other combat enabler, a military leader must possess sufficient knowledge of the enabler. Introducing education on AI into intermediate and senior Service college curricula would accomplish this. The Army’s Military Intelligence Center of Excellence is already pioneering training of this type in its warrant officer advanced course, wherein students are given a high-level overview of machine learning, what it looks like when AI-enabled systems go wrong, and the military intelligence functions in which students may encounter these AI-enabled technologies.12 The course instructors also challenge students to identify a problem in their own workflows that could be addressed by an AI-enabled solution and to plan how they would implement that solution. Within the joint community, the Chief Digital and Artificial Intelligence Office is currently experimenting with a “Lead AI” course that pursues similar goals and strives to create awareness of AI capabilities for senior leaders.13 Training leaders so that they know what AI can provide and challenging them to think about which of their functions or roles could benefit from AI will greatly speed the creation of an AI-ready military.

Additionally, since the design and production of AI-enabled technologies will continue to be the domain of defense contractors, it is important for personnel involved in the acquisitions process to possess appreciable AI knowledge. Civilian AI experts will not necessarily understand the military problems they are building AI solutions for, and military personnel may not necessarily understand the AI technology; acquisitions personnel must bridge that gap. It is vital to the health of the force that acquisitions personnel be able to evaluate proposed solutions and ensure AI is properly incorporated into military systems. Other commentators have remarked on this need for AI training for acquisitions personnel,14 and there has been some recent work outlining AI-specific checks for military projects in the development phase.15 While this layer of the AI-enabled workforce could benefit from some practice and expertise in AI, neither the leadership function nor the acquisitions function requires these personnel to be AI practitioners to carry out their organizational roles.

It should also be noted that there is considerable complexity in the processes and roles within the military’s acquisitions workforce and that the need for AI technical expertise will likely vary significantly across the acquisitions enterprise. For example, individuals involved in testing and evaluating a possible new system will likely require more AI technical skills than those involved in project management or contracting. The acquisitions component in this model is meant to apply to the broader, more generic functions of acquisitions.

Joint Department of Defense team executed 12 artificial intelligence flight tests in which AI agents piloted X-62A Variable Stability In-Flight Simulator Test Aircraft, seen here in an August 26, 2022, photo, to perform advanced fighter maneuvers at Edwards Air Force Base, California, December 1–16, 2022 (U.S. Air Force/Kyle Brasier)

The AI technicians component comprises individuals who are primarily responsible for maintaining AI-enabled systems, including their machine learning models and data pipelines. This maintenance will require some hands-on (but not expert-level) AI technical skills. Students will require hands-on experience with machine learning–related skills, like model fine-tuning, and with running AI enablers, like cloud instances. The Army’s Artificial Intelligence Integration Center is set to begin the third iteration of its AI Cloud technician’s course, which serves as a good starting place for this technician level of training and education.16 Students in the course are taught Python programming, along with cloud administration and some basic skills in modifying machine learning models. Following the classroom instruction, students have a utilization tour wherein, ideally, they can further hone their skills. While this program is a good start, these technician programs will likely need to be expanded and focused on certain maintenance functions of AI-enabled systems in the future, to include machine learning model maintenance and data curation. The Chief Digital and Artificial Intelligence Office has also highlighted a worker archetype, “Embed AI,” which would cover this role as well (although it does not appear to have any training associated with it).17 At the technician layer, the workforce will need education that includes hands-on practice with the maintenance aspects of AI.
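
To give a sense of what basic model modification involves in practice, the following is a minimal, hypothetical sketch of fine-tuning a pretrained image classifier using the PyTorch and torchvision libraries. The chosen backbone, the notional two-class problem, and the random placeholder batch are assumptions for illustration, not a description of any Army system or course content.

    import torch
    import torch.nn as nn
    from torchvision import models

    # Load a pretrained backbone (any pretrained model would serve the illustration).
    model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

    # Freeze the pretrained feature extractor so only the new head is trained.
    for param in model.parameters():
        param.requires_grad = False

    # Replace the classification head for a notional two-class problem.
    model.fc = nn.Linear(model.fc.in_features, 2)

    optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()

    # One illustrative training step on a placeholder batch; real fine-tuning would
    # loop over labeled imagery collected from the fielded sensor.
    images = torch.randn(8, 3, 224, 224)
    labels = torch.randint(0, 2, (8,))
    model.train()
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()

Even this toy example exercises several technician skills at once: Python, basic familiarity with a machine learning framework, and judgment about which parts of a model to adjust and which to leave alone.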

Closely related to AI technicians are AI functionaries. The maintenance of AI-enabled systems will occasionally require more detailed skills in larger, more complex machine learning operations at higher echelons.18 There will also be a need for ad hoc and customized data science and AI solutions to specific unit and battlefield problems. Some units, such as the 513th Military Intelligence Brigade, have already experimented with this concept by having a unit data scientist officer who can deliver quick, simple machine learning solutions to unit problems.19 At this layer, students will need not only a greater depth of hands-on technical skills than at the previous layer but also a greater breadth of knowledge across more elements of an AI-enabled system. This type of work will likely require experiential learning that, at this time, can only be imparted by a higher-level education program. As an example, the Army’s Artificial Intelligence Integration Center is running the second iteration of its AI scholars’ program.20 Army company-grade officers are sent to graduate school to obtain a master’s degree in an AI-relevant field, followed by a utilization tour with the Artificial Intelligence Integration Center to, ideally, further refine and practice their skills. The U.S. Air Force produces similar results with its DAF-MIT AI Accelerator program.21 At this layer, the workforce will need both more breadth and more depth of practiced skills in AI; however, relatively few interactions within a military organization will require this level of skill.

Then there are the experts in AI: professionals dedicated to practicing AI, with a high level of education and practical experience in their relevant AI fields. Their profession is exclusively the practice of AI. They are also very expensive to produce, not only because they often require top-level degrees but also because of the time they must invest in their practice. Furthermore, to really grow, retain, and employ these individuals, even at a basic level, the military would have to significantly change its manning practices, as has been outlined in the National Security Commission on Artificial Intelligence’s Final Report and argued by other authors.22 Because relatively few interactions with AI-enabled military systems require a true expert, experts can fall out of practice with critical skills. This is costly both because of the initial investment in such specialized skills and because of the subsequent loss of those skills from disuse. Thus, while experts are absolutely needed, the force should prioritize using fewer experts more effectively until the demands of AI-enabled warfare grow and battlefield experience can clarify where investments in expertise are needed.

It is important for military decisionmakers not to become fixated on having the best-of-the-best AI practitioners at the expense of broad exposure to AI skills in uniform. Finally, it is also worth pointing out, as other commentators have,23 that a mode of service like Component 3 (Army Reserve) might be more conducive to growing AI experts for the military workforce than other modes of service, like Component 1 (Active Duty). The 75th Innovation Command, a Component 3 unit assigned to Army Futures Command, would be a good place to grow AI experts. Most Component 3 personnel also have a civilian career, and some may already work in science, technology, engineering, and mathematics fields, to include AI and machine learning. Reserve Component service, combined with enablers like remote work, allows AI experts to remain practitioners in their fields while the military establishment retains the ability to leverage them when an AI expert is actually needed.

Finally, while the model contains a certain hierarchy in terms of the number of people involved and the time spent doing hands-on technical work, the skills needed for each component do not necessarily overlap. For example, a skill such as fine-tuning a pretrained model will be shared by AI technicians and all the components above them (AI functionaries, AI experts), but other skills, like strategic planning for AI employment or project management, do not translate up the hierarchy. Nor does the hierarchy necessarily imply level of expertise. For example, an AI technician could be an expert at fine-tuning computer vision models, while an AI expert whose specialty is something like reinforcement learning may have only a basic ability to do so. While expertise and skills generally increase as one moves up the hierarchy in the model, this is not always the case.

U.S. Central Command Chief Technology Officer Schuyler Moore (left) and Army Sergeant Mickey Reeves, winner of U.S. Central Command’s 2022 Innovation Oasis, conduct press briefing on artificial intelligence and unmanned systems at Pentagon, Washington, DC, December 7, 2022 (DOD/Alexander Kubitza)

Closing Thoughts

The best starting point for creating organizational change toward an AI-enabled workforce is education and training for leadership and the acquisitions community. This education should be combined with realistic experimentation exercises and wargaming on how to employ proposed or possible AI-enabled systems. Some of this already occurs with XVIII Airborne Corps’ AI-enabled live-fire exercises and Army Futures Command’s Future Study Program.24 Additionally, it is critical for the acquisitions personnel, who are responsible for “buying” all the AI-enabled technology, to obtain AI-enabled systems that can both meet warfighter needs and be used and maintained by Servicemembers. After that, as AI-enabled technologies are distributed across the force, it will be important to prioritize user-level and maintenance-level training. Finally, while most of the examples in this article come from an Army perspective, the model and its associated roles and observations should generally apply to any military Service.

A key component of a revolution in military affairs is the ability of a military force to successfully incorporate new technologies into operations, training, doctrine, and other military processes.25 The advantages of AI will accrue to the military that can best employ it.26 To realize the potentially groundbreaking value of AI technology, military organizations must work toward creating an AI-enabled workforce. The creation of this workforce should be based on the nature of AI in the military rather than on an obsession with expertise or a default to AI experts born of unfamiliarity with AI. As such, I advocate for an AI skills-in-depth model that deemphasizes the creation of AI experts, which is both costly and, because integrated AI warfighting has not yet fully arrived, not yet necessary en masse; those experts’ skills would simply atrophy. Creating an AI-enabled workforce requires more than training AI experts and hoping AI will deliver revolutionary effects on the battlefield. JFQ

Notes

1 Final Report (Washington, DC: National Security Commission on Artificial Intelligence, 2021), 119–131, https://www.nscai.gov/wp-content/uploads/2021/03/Full-Report-Digital-1.pdf.

2 Michael Raska and Richard Bitzinger, “Artificial Intelligence: A Revolution in Military Affairs?” Singapore Defence Technology Summit, June 26–28, 2019, https://www.dsta.gov.sg/docs/default-source/documents/190625_tech-summit-commentary_a-revolution-in-military-affairs.pdf.

3 José-Marie Griffiths and Justin Lynch, “How to Build a Well-Rounded AI Workforce,” National Defense, October 21, 2020, https://www.nationaldefensemagazine.org/articles/2020/10/21/how-to-build-a-well-rounded-ai-workforce.

4 Diana Gelhaus and Santiago Mutis, The U.S. AI Workforce: Understanding the Supply of AI Talent, CSET Issue Brief (Washington, DC: Center for Security and Emerging Technology, January 2021), https://cset.georgetown.edu/publication/the-u-s-ai-workforce.

5 Griffiths and Lynch, “How to Build a Well-Rounded AI Workforce”; Joe Chappa, “Trust and Tech: AI Education in the Military,” War on the Rocks, March 2, 2021, https://warontherocks.com/2021/03/trust-and-tech-ai-education-in-the-military.

6 Marianne Bellotti, “Helping Humans and Computers Fight Together: Military Lessons from Civilian AI,” War on the Rocks, March 15, 2021, https://warontherocks.com/2021/03/helping-humans-and-computers-fight-together-military-lessons-from-civilian-ai.

7 Veljko Krunic, Succeeding with AI: How to Make AI Work for Your Business (New York: Manning, 2020).

8 D. Sculley et al., “Hidden Technical Debt in Machine Learning Systems,” Conference and Workshop on Neural Information Processing Systems, 2015, https://proceedings.neurips.cc/paper/2015/file/86df7dcfd896fcaf2674f757a2463eba-Paper.pdf.

9 Zachary S. Davis, Artificial Intelligence on the Battlefield: An Initial Survey of Potential Implications for Deterrence, Stability, and Strategic Surprise (Livermore, CA: Center for Global Security Research, Lawrence Livermore National Laboratory, 2019), https://cgsr.llnl.gov/content/assets/docs/CGSR-AI_BattlefieldWEB.pdf.

10 Keith Jordan, “All Together Now,” U.S. Army, August 20, 2014, https://www.army.mil/article/132125/all_together_now.

11 Marigold Black et al., Supporting the Royal Australian Navy’s Campaign Plan for Robotics and Autonomous Systems: Human-Machine Teaming and the Future Workforce (Canberra: RAND Australia, 2022), chap. 3, https://www.rand.org/content/dam/rand/pubs/research_reports/RRA1300/RRA1377-2/RAND_RRA1377-2.pdf.

12 Eric Holder, email correspondence with author, July 2021.

13 DOD AI Education Strategy: Cultivating an AI Ready Force to Accelerate Adoption (Washington, DC: Joint Artificial Intelligence Center, 2020), https://www.ai.mil/docs/2020_DoD_AI_Training_and_Education_Strategy_and_Infographic_10_27_20.pdf.

14 Griffiths and Lynch, “How to Build a Well-Rounded AI Workforce.”

15 Bruce Nagy, “Tips for CDRLs/Requirements When Acquiring/Developing AI-Enabled Systems,” Proceedings of the Nineteenth Annual Acquisition Research Symposium, May 2, 2022, https://dair.nps.edu/bitstream/123456789/4587/1/SYM-AM-22-074.pdf.

16 Army Futures Command, “Recruiting Window Now Open for Next AFC Software Factory, Cloud Technician Cohorts,” U.S. Army, June 7, 2021, https://www.army.mil/article/247292/recruiting_window_now_open_for_next_afc_software_factory_cloud_technician_cohorts.

17 DOD AI Education Strategy.

18 Harshit Tyagi, “What Is MLOps—Everything You Must Know to Get Started,” Towards Data Science, March 25, 2021, https://towardsdatascience.com/what-is-mlops-everything-you-must-know-to-get-started-523f2d0b8bd8.

19 Alexandra Farr and Bre Washburn, email correspondence with author, February 2022.

20 “CMU Partners with U.S. Army To Grow Data Science and AI Expertise,” Carnegie Mellon University, September 1, 2020, https://www.cmu.edu/news/stories/archives/2020/september/army-partners-grow-data-science.html.

21 “DAF-MIT AI Accelerator,” Department of the Air Force and Massachusetts Institute of Technology, 2023, https://aia.mit.edu.

22 Final Report, 119–131.

23 Griffiths and Lynch, “How to Build a Well-Rounded AI Workforce.”

24 Dan Roper, “Special Edition: Army Future Study Program,” interview with Stephanie Ahern and Jean Vettel, Association of the United States Army’s Army Matters podcast, March 29, 2021, https://podcast.ausa.org/e/special-edition-army-future-study-program.

25 Williamson Murray and MacGregor Knox, “Thinking About Revolutions in Warfare,” in The Dynamics of Military Revolution, 1300–2050, ed. MacGregor Knox and Williamson Murray (Cambridge: Cambridge University Press, 2001), 175–194.

26 Paul Scharre, Robotics on the Battlefield—Part 1: Range, Persistence, and Daring (Washington, DC: Center for a New American Security, May 2014), https://www.jstor.org/stable/resrep06404.