Dr. Kelly John Ward is Professor of Strategy and Policy in the National War College at the National Defense University.
Changing the curriculum of any senior Service college (SSC) is never undertaken lightly. Over time, and with care and experience, the commandant, dean, and associate deans craft a well-balanced mix of operational and strategic topics to best prepare their students for future senior leadership positions. The most precious resource at an SSC is time on the academic calendar. Every assigned reading, every lecture by an expert or senior leader, and every seminar the students participate in is valuable because of the opportunity cost. In a constrained 10-month master’s degree program that must meet and excel at the tasks stipulated in the joint professional military education (JPME) program to satisfy the Process for Accreditation of Joint Education requirements, an outcomes-based military education has no room for extraneous material.1 SSC curricula are a delicately balanced mix of subjects, discussions, applied thinking and exercises, and student evaluations. The unfortunate reality is that adding important topics or material to the SSC curriculum requires removing equally important material—and often upsets the delicate balance that has been built over time.
A recent Joint Chiefs of Staff (JCS) publication, Developing Today’s Joint Officers for Tomorrow’s Ways of War (JCS PME Vision 2020), provides the impetus for a curriculum assessment at the National War College (NWC).2 More specifically, the dean of the NWC asked me to develop and potentially integrate material into the NWC curriculum to address the following points in the JCS PME Vision 2020:
Our leader development enterprise demands a . . . deeper understanding of the implications of disruptive and future technologies for adversaries and ourselves; JPME programs must provide graduates the initial knowledge and skills to prepare them for service as warfighting joint leaders, senior staff officers, and strategists who . . . anticipate and lead rapid adaptation and innovation during a dynamic period of acceleration in the rate of change in warfare under the conditions of Great Power competition and disruptive technology.3
By their very nature, disruptive technologies are uncertain, but they are not always unpredictable. The unclassified version of the 2018 National Defense Strategy identified the following list as areas where rapid technological advancement could change the character of war:
- advanced computing
- “big data” analytics
- artificial intelligence (AI)
- autonomy
- robotics
- directed energy
- hypersonics
- biotechnology.4
There are of course other emerging technologies that have the potential to change both the character of war and the larger economic competition among powerful nations. This article describes one potential answer, for just one of the many Department of Defense (DOD) graduate institutions and SSCs, to the question of which emerging technologies should be integrated into the NWC curriculum and to what degree. It recognizes that there are many possible solutions to senior leader development in emerging technologies and that the preferred solution will vary from school to school based on the current curriculum, faculty expertise, degree focus, and other factors. The intent of this article is to add to the discussion and provide a logical baseline for how one SSC addressed the imperative to “provide graduates the initial knowledge and skills to prepare them for service as . . . strategists who . . . anticipate and lead rapid adaptation and innovation . . . under the conditions of great power competition and disruptive technology.”5
Which Emerging and Disruptive Technologies to Teach?
Emerging technology is a term generally used to describe a new technology, but it may also refer to the continuing development of an existing technology. Emerging technology can also have a slightly different meaning when used in different areas, such as business, science, education, or national security. For example, DOD has always been focused on developing emerging military technologies to enhance national security and maintain superiority over potential competitors.
In 2020, the top 10 emerging technologies, according to the CompTIA Emerging Technology Community, were:
- AI
- 5G (fifth-generation technology standard for broadband cellular networks)
- Internet of Things
- serverless computing
- biometrics
- augmented reality/virtual reality
- blockchain
- robotics
- natural language processing
- quantum computing (QC).6
Disruptive technology, alternatively, is an innovation that significantly modifies the way that consumers, industries, businesses, or the military operate. A disruptive technology quickly displaces the systems or habits it replaces because it has attributes that are recognizably superior. Recent disruptive technology examples include e-commerce, online news sites, ride-sharing apps, and global positioning systems. At one time, the automobile, electricity service, television, and atomic weapons were considered disruptive technologies.
In 2010, the Committee on Forecasting Future Disruptive Technologies wrote:
New technologies continue to emerge in every field and in [every] part of the world. In many cases, when a technology first emerges, its disruptive potential is not readily apparent. It is only later, once it has been applied or combined in an innovative way, that the disruption occurs. In other cases, however, a disruptive technology can truly be the result of a scientific or technological breakthrough. Some of these technologies are specific and target a niche market, while others possess the potential for widespread use and may open up new markets. A disruptive technology may change the status quo to such an extent that it leads to the demise of an existing infrastructure. Accordingly, three important questions should be asked about emerging technologies: Which of them could be considered latently disruptive? In which sector, region, or application would the technology be disruptive? What is the projected timeline for its implementation?7
The Congressional Research Service recently analyzed current emerging military technologies, including AI, lethal autonomous weapons systems, hypersonic weapons, directed energy weapons, biotechnology, and quantum technology.8 Comparing this list with the CompTIA list of emerging commercial technologies, we see two areas of overlap: AI and QC technology. For the NWC curriculum—with its focus on Great Power competition and emphasis on all national elements and instruments of power—AI and its subfield, machine learning (ML), seemed to be the area of emerging technology on which to focus. Further research and analysis supported this initial intuition.
To meet the purpose of the JCS PME Vision 2020 for the NWC curriculum, the following two learning objectives were developed:
- understand the vocabulary and concepts behind the emerging (and potentially disruptive) technologies of AI and ML
- understand the current and potential future applications and capabilities, as well as some of the limitations and concerns, of AI and ML.
AI/ML in an SSC Curriculum
AI and ML are upon us. Information on AI is flooding the market, media, and social channels. Former Secretary of Defense Mark Esper highlighted AI as one of DOD’s top 11 modernization initiatives.9 In 2018, DOD created the Joint Artificial Intelligence Center (JAIC) to coordinate efforts to use ML and other AI to maintain a lethality and efficiency edge over other nations’ militaries. Without a doubt, AI and ML are topics worth the attention of future strategic leaders.
It can be difficult to sift through the media hype and grandiose promises of AI firms to understand exactly how AI/ML could be applied in practical and reliable ways. Of course, incorporating new technology into governmental or commercial processes requires significant leadership and effective direction that all stakeholders can easily understand.
Many of our daily human experiences and interactions involve machines or devices that are already using AI of some sort. Examples include a Google search, ride-sharing apps such as Uber or Lyft, email spam filters, banking and credit card fraud prevention, and online shopping searches. AI/ML technology is an integral part of our lives already, and its ubiquity will only increase. Strategic leaders will be called on to evaluate how we can better use the strengths of AI—while acknowledging its weaknesses—to augment our ability to defend our national interests.
Advances in ML have allowed us to create systems that can automate complex tasks through constant learning. Computers have always been able to assist us and make assessments about the world based on the information we provide to them. But we have moved beyond telling these machines exactly what to do with our data. Now machines can learn from patterns and anomalies they find in data on their own. These are patterns and anomalies that our minds cannot feasibly find because of the sheer size and intricacy of the data. AI’s strength comes from its ability to analyze large volumes of data reliably, efficiently, accurately, and without fatigue.
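To make the idea of machines finding patterns and anomalies on their own more concrete, the minimal sketch below trains an off-the-shelf anomaly detector on synthetic data and lets it flag outliers without any labels. It is an illustration only; the library (Python’s scikit-learn), the generated data, and the parameter choices are assumptions made for the example, not a description of any DOD system or NWC course material.

```python
# Illustrative sketch only: unsupervised anomaly detection with scikit-learn.
# The data are synthetic and the parameters are arbitrary choices for the example.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# "Normal" activity: 1,000 two-dimensional observations clustered near zero.
normal = rng.normal(loc=0.0, scale=1.0, size=(1000, 2))

# A handful of outliers that no analyst has labeled in advance.
outliers = rng.uniform(low=6.0, high=8.0, size=(10, 2))
data = np.vstack([normal, outliers])

# The model learns what "typical" observations look like without any labels,
# then scores every observation; a prediction of -1 flags an anomaly.
model = IsolationForest(contamination=0.01, random_state=0)
labels = model.fit_predict(data)

print(f"Flagged {int((labels == -1).sum())} of {len(data)} observations as anomalous")
```

The same basic pattern, fitting a model on historical data and then scoring new observations, underlies many of the everyday applications mentioned above, such as spam filtering and credit card fraud detection.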
However, AI/ML does not understand strategy. It is constrained to a specific task, which it executes in an efficient manner. Its ability to learn and provide insights is limited in scope. It still requires humans to take those insights and determine what role they will play in a larger strategy that accomplishes the identified objectives. If DOD and the Intelligence Community can harness the strengths of AI—and autonomy, a major area of application of AI—while acknowledging the weaknesses, then national security professionals can use these technologies to better achieve future success.
Resources
In 2018, at the urging of the Pentagon’s Defense Innovation Board, JAIC was stood up.10 Working within the office of the DOD Chief Information Officer, JAIC seeks “to transform the DOD by accelerating the delivery and adoption of AI to achieve mission impact at scale.”11 Part of JAIC’s holistic approach is to “cultivate a leading AI workforce.”12 JAIC’s chief AI architect, Nate Bastian, spearheaded an effort in the summer and fall of 2020 to develop multiple “AI communities of interest” across DOD, especially within educational institutions such as the Service academies and PME and JPME organizations. His efforts succeeded in creating the DOD-affiliated Graduate Institution Artificial Intelligence Community of Interest (Graduate AI COI), bringing together interested faculty, researchers, and leaders from across the SSCs and all other master’s degree–granting institutions in DOD. The Graduate AI COI has been extremely valuable in this process of analyzing AI/ML as a potential addition to the NWC core curriculum. Thomas Linn from the NWC was generous in sharing his perspective on what SSC students should know about AI/ML and QC. All SSC representatives have shared what their institutions currently teach their students, mostly via electives or guest speakers.
For academic year 2021, only the National Defense University’s (NDU’s) College of Information and Cyberspace (CIC) has a block of instruction in its core courses about AI/ML and QC. CIC’s Linda Jantzen was especially generous in providing the curriculum that CIC teaches its students, which provided a great starting point for this analysis. The bottom line here is that JAIC’s Graduate AI COI is a tremendous resource, full of professionals, researchers, and educators with expertise and interest in AI/ML.
The next potential resource for any SSC considering adding AI/ML to its core curriculum is appropriate guest speakers. The advantages of having a true AI professional who can bring excitement and valuable knowledge about AI to our mature student body cannot be overstated. Speakers who combine a genuine understanding of AI with experience and expertise in national security are especially valuable.13
Some Readings to Consider
What should SSC students read about AI/ML as an emerging and potentially disruptive technology that will prepare them as future warfighting joint leaders, senior staff officers, and strategists? The students will vary greatly in their prior knowledge of AI, as is the case with many topics we teach. (Some students may even hold advanced degrees in AI-related subjects or have prior assignments in AI-related fields or acquisition.) However, it should be assumed that the typical SSC student (and faculty member) has little prior knowledge of AI and ML. From the thousands of available books, articles, videos, and Web sites about AI and ML, what should be the selection criteria for materials? The NWC chose material that balanced several criteria: informative, related to national security, not purely commercial in application, and straightforward and efficient at imparting the information. The key is to create and adhere to a well-considered and comprehensive catalogue of the concepts, ideas, arguments, and counterarguments to which to expose SSC students. (The selected annotated bibliography at the end of this article includes only a small sample of what is available.) More important, the catalogue of concepts, ideas, arguments, and counterarguments that the NWC selected as most relevant for our students will almost certainly not match another SSC’s catalogue of important AI/ML concepts. Each school should employ the expertise that already exists within its institution when considering adding (or expanding) coverage of emerging and disruptive technologies in its curriculum.
Seminar Discussions
The Socratic seminar that is the centerpiece of the pedagogy at the NWC and other SSCs will be vitally important to meet the JCS PME Vision 2020 goal of creating leaders with a “deeper understanding of the implications of disruptive and future technologies for adversaries and ourselves.”14 Guest speakers and selected readings will introduce the current evolution of AI/ML and its challenges, limitations, and vulnerabilities, as well as our Great Power rivals’ emphasis on rapid development of AI for both economic development and military dominance.15 The faculty seminar leaders will have to guide the discussion and debate toward the larger strategic issue of the advantages of being first to develop a disruptive technology. Encouraging students to think and debate—and logically defend—their thoughts and potential biases on the larger issue of AI as a disruptive technology is the goal.
Potential questions for inclusion in a seminar include:
- Could AI/ML advances truly disrupt? Which applications would be most disruptive, and over what time frame?
- To what level do strategic leaders need to understand AI and ML—and other emerging/disruptive technologies—to be effective decisionmakers?
- Commercial businesses and big tech firms are at the forefront of AI and ML research. How can governments and national security agencies possibly benefit from these advances?
- Which application of AI provides the most potential, either commercially or from a national security perspective? Why?
- Which AI vulnerability or safety issue (for example, brittleness, unpredictability, bias, or ethics) is of most concern? Why?
- What strategic-level actions should the United States—and our allies—take today and in the near future to ensure that China does not gain tactical, operational, or strategic advantage with AI systems, autonomous capabilities, or decision support systems? What would be the estimated costs of these U.S. actions, and what would be the potential budget tradeoffs?
The JCS PME Vision 2020 states, “JPME programs must provide graduates the initial knowledge and skills to . . . anticipate and lead rapid adaptation and innovation . . . under the conditions of great power competition and disruptive technology.”16 Some deans and associate deans assert that the described one- to two-lesson block of readings, guest speakers, and focused seminar discussion fulfills the JCS’s intent, using AI/ML as a prime example of a disruptive technology that is currently relevant to strategic leaders. A complementary benefit of incorporating these AI concepts into the curriculum is that doing so creates an additional opportunity to familiarize students with innovation and leadership through change, as specified in the JCS PME Vision 2020: “Anticipate and lead rapid adaptation and innovation during a dynamic period of acceleration in the rate of change in warfare under the conditions of Great Power competition and disruptive technology.”17
An argument can also be made that, after discussing and debating AI/ML in seminar, students should conduct a collaborative exercise of some type. This experiential learning task could be centered on the questions posed in the National Research Council’s Persistent Forecasting of Disruptive Technologies:
Which of the AI/ML technology applications would you consider to be the most latently disruptive for national security? Why? What is the projected timeline for its implementation, either by the United States and our allies, or by a strategic competitor? What actions and budget decisions should the U.S. and DOD be considering now to offset the risks or take advantage of the rewards when your selected disruptive application of AI/ML becomes a reality?18
Such exercises, however, take a significant amount of limited academic time. The leaders ultimately responsible for the SSC curriculum will need to carefully weigh the benefits of an AI/ML exercise against the opportunity costs to the other topics in the academic program.
AI and ML are not going away. The SSCs and other JPME institutions, working with each other and with organizations such as JAIC and the National Security Commission on Artificial Intelligence, can better prepare our future senior leaders by integrating AI/ML into our Great Power competition–focused curricula. Leveraging professional military education to teach our students baseline knowledge and skills, and how to think about these disruptive technologies, will be critical to our nation’s future economic and security success. JFQ
Selected Annotated Bibliography
“Artificial Intelligence and Its Limits: Reality Check.” The Economist, June 11, 2020, 3–12. Describes AI limitations in detail, including the scarcity of AI talent and programmers, high cost of hardware, autonomy and failures, opaqueness, and black box ethical issues.
Bergstein, Brian. “What AI Still Can’t Do.” MIT Technology Review (February 19, 2020). AI systems do not understand cause and effect and thus lack common sense. These systems could be easily fooled and prove untrustworthy in complex environments.
Boulanin, Vincent, et al. “Artificial Intelligence: A Primer.” In The Impact of Artificial Intelligence on Strategic Stability and Nuclear Risk, vol. 1, Euro-Atlantic Perspectives, edited by Vincent Boulanin, 13–22. Stockholm: Stockholm International Peace Research Institute (SIPRI), May 2019. Offers a basic introduction and AI vocabulary. Concentrates on narrow versus general AI. Discusses AI “hype cycles” and “AI winters.” Further considers machine learning, deep learning, and autonomy, as well as data management and the challenges of opacity and brittleness.
Davis, Zachary. “Artificial Intelligence on the Battlefield: Implications for Deterrence and Surprise.” PRISM 8, no. 2 (October 2019), 115–128. Examines the potential risks and rewards of military applications of AI and the potential consequences of AI on strategic stability and deterrence. Also considers unintended consequences, risk, and potential strategic surprise from AI.
Johnson, James. “Artificial Intelligence: A Threat to Strategic Stability.” Strategic Studies Quarterly 14, no. 1 (Spring 2020), 16–39. Explains how and why AI-augmented conventional capabilities could affect strategic stability between China and the United States, including nuclear stability.
Kissinger, Henry, Eric Schmidt, and Daniel Huttenlocher. “The Metamorphosis.” The Atlantic, August 2019. Discusses the AI revolution being unstoppable and how societies should prepare for changes to human knowledge, perception, culture, civilization, and history.
Knight, Will. “The Dark Secret at the Heart of AI.” MIT Technology Review (April 11, 2017). Examines deep learning’s “black box” and unexplainability problem—that is, decisions made are uninterpretable, even by design engineers.
Pfaff, Anthony C. “The Ethics of Acquiring Disruptive Technologies: Artificial Intelligence, Autonomous Weapons, and Decision Support Systems.” PRISM 8, no. 3 (January 2020), 129–143. Discusses the ethics of military AI systems that may be involved in life-and-death decisions, including both lethal autonomous weapons systems and decision support systems. Also examines the responsibility gap and potential moral hazards of AI systems.
Ryan, Mick. “Intellectual Preparation for Future War: How Artificial Intelligence Will Change Professional Military Education.” War on the Rocks, July 3, 2018. Argues that AI could result in new warfighting concepts and operational approaches, requiring military officers conversant in the strengths and weaknesses of AI. Asserts that a larger percentage of military personnel must be literate in AI and advanced technologies.
Scharre, Paul, and Michael C. Horowitz. Artificial Intelligence: What Every Policymaker Needs to Know. Washington, DC: Center for a New American Security, June 2018, 3–16. Discusses AI’s cognitization effect on innovation and economic growth. Also considers unsupervised learning versus deep learning. Examines AI strengths, such as data classification, anomaly detection, prediction, and optimization, and limitations, such as brittleness, predictability, and explainability.
Vinci, Anthony. “The Coming Revolution in Intelligence Affairs.” Foreign Affairs, August 31, 2020. Argues that the amount of data available to intelligence agencies has driven the adoption of AI. Asserts that intelligence agencies must continually shift and adapt their budgets, personnel, and training for AI-driven systems.
White, Sam. “AI and the Urgency of Finishing First.” The War Room Online, November 27, 2018. Discusses AI as an enabling technology that will change the character of war. Also posits strategic competitors could build sustainable advantage if they develop AI first.
Notes
1 See Chairman of the Joint Chiefs of Staff Instruction 1800.01F, Officer Professional Military Education Policy (Washington, DC: The Joint Staff, May 15, 2020), available at <https://www.jcs.mil/Portals/36/Documents/Doctrine/education/cjcsi_1800_01f.pdf?ver=2020-05-15-102430-580>.
2 Developing Today’s Joint Officers for Tomorrow’s Ways of War: The Joint Chiefs of Staff Vision and Guidance for Professional Military Education & Talent Management (Washington, DC: The Joint Staff, May 1, 2020), available at <https://www.jcs.mil/Portals/36/Documents/Doctrine/education/jcs_pme_tm_vision.pdf?ver=2020-05-15-102429-817>.
3 Ibid., 3–4.
4 Summary of the 2018 National Defense Strategy of the United States of America: Sharpening the American Military’s Competitive Edge (Washington, DC: Department of Defense, 2018), 3.
5 Developing Today’s Joint Officers for Tomorrow’s Ways of War, 4.
6 Louisa Fitzgerald, “Ten Emerging Technologies Making an Impact in 2020,” CompTIA, June 10, 2020, available at <https://www.comptia.org/blog/emerging-technologies-impact-2020>.
7 National Research Council (NRC), Persistent Forecasting of Disruptive Technologies (Washington, DC: National Academies Press, 2010), 11.
8 Kelley M. Sayler, Emerging Military Technologies: Background and Issues for Congress, R46458 (Washington, DC: Congressional Research Service, updated November 10, 2020).
9 “Secretary of Defense Mark T. Esper: Message to the Force on Accomplishments in Implementation of the National Defense Strategy,” transcript, July 7, 2020, available at <https://www.defense.gov/Newsroom/Transcripts/Transcript/Article/2266872/secretary-of-defense-mark-t-esper-message-to-the-force-on-accomplishments-in-im/>.
10 “AI Manoeuvres: Business Lessons from the Pentagon,” The Economist, May 30, 2020, 61–62.
11 Joint Artificial Intelligence Center (JAIC), “Vision: Transform the DOD Through Artificial Intelligence,” Chief Information Officer, U.S. Department of Defense, available at <https://dodcio.defense.gov/About-DoD-CIO/Organization/jaic>.
12 Ibid.
13 From the experiences of the members of JAIC’s community of interest, particularly Andy Leith, who teaches the AI Industry Study at the Eisenhower School for National Security and Resource Strategy at the National Defense University (NDU), a partial list of possible choices for dynamic and relevant guest speakers would include Eric Schmidt, Defense Innovation Board, former Google/Alphabet chairman; Paul Scharre, senior fellow and director of the Center for a New American Security (CNAS); Vint Cerf, Internet pioneer (co-inventor of TCP/IP) and “chief Internet evangelist” for Google; Robert Work, former Deputy Secretary of Defense, now at CNAS; Robert Atkinson, president of the Information Technology and Innovation Foundation; Jason Matheny, director of the Center for Security and Emerging Technology at Georgetown; Yll Bajraktari, executive director, National Security Commission on Artificial Intelligence; P.J. Maykish, senior military fellow, Center for Strategic Research, Institute for National Security Studies, at NDU, detailed full time to the National Security Commission on Artificial Intelligence; Robert Spalding, senior fellow, Hudson Institute; Reginald Hobbs, Army Research Laboratory and adjunct lecturer, Howard University; Darryl Ahner, interim dean for research, Air Force Institute of Technology; Sam White, deputy director, Center for Strategic Leadership, U.S. Army War College; Gavin Taylor, associate professor of computer science, U.S. Naval Academy; Nate Bastian, chief data scientist, Army Cyber Institute; and Chuck Howell, chief scientist responsible for AI, MITRE.
14 Developing Today’s Joint Officers for Tomorrow’s Ways of War, 3.
15 For example, Mick Ryan, “Intellectual Preparation for Future War: How Artificial Intelligence Will Change Professional Military Education,” War on the Rocks, July 3, 2018, available at <https://warontherocks.com/2018/07/intellectual-preparation-for-future-war-how-artificial-intelligence-will-change-professional-military-education/>; Mick Ryan, “Integrating Humans and Machines,” The Strategy Bridge, January 2, 2018, available at <https://thestrategybridge.org/the-bridge/2018/1/2/integrating-humans-and-machines>.
16 Developing Today’s Joint Officers for Tomorrow’s Ways of War, 4. Emphasis added.
17 Ibid.
18 NRC, Persistent Forecasting of Disruptive Technologies.