Sept. 30, 2014

Is Military Science “Scientific”?

By Glenn Voelz | Joint Force Quarterly 75

The term military science generally describes the body of theories, concepts, and methods for employing armed forces. However, as an academic discipline it is ill defined, drawing from a patchwork of curricula including history, foreign affairs, security studies, leadership, operations management, and systems engineering, as well as other elements of the physical and social sciences. Notably, the Department of Defense dictionary does not even provide a definition. This vague categorization is somewhat reflective of the term’s diminished status from its 19th-century usage when Military Science was frequently capitalized and placed alongside Physics, Philosophy, and other well-established academic disciplines.

Senior Airmen program Wireless Gate Release System before airdrop at Bagram Air Field (U.S. Air Force/Evelyn Chavez)

An irony of the term’s decline is that it occurred over a period when military professionals increasingly conceptualized their discipline in the terminology and metaphors of science. This transformation was driven in part by the institutionalization of officer education programs emphasizing the formalized study of military theory. A second factor, rapid industrialization, firmly established science and technology as the central pillars of American military power and arguably the foundational elements in approaches to doctrine and planning. These trends reinforced the proposition that the practical application of military theory, as expressed through strategy, doctrine, and planning, was becoming more of a science and less of an art. This perspective has reached an apex in recent decades, epitomized by doctrinal methodologies seeking to reduce decisionmaking to formulaic processes—not unlike the methods used by chemists mixing compounds for desired effect. In particular, there has been a tendency toward instrumental applications of descriptive theory attempting to distill complex social dynamics into bounded problem statements that fit neatly into prescribed planning schemas and process solutions.1

Military science certainly shares some basic traits with the physical sciences in the use of observation, description, measurement, and structured analysis supporting causal inferences or explanatory hypotheses. However, military science remains distinct from the physical sciences in significant ways, most notably in the absence of controlled, replicable experimentation as a means of validating theory. For this and other reasons, the conceptual foundations of the field reside more appropriately in the realm of the social sciences. While this conclusion may be intuitively obvious to most military professionals, its practical implications are increasingly overlooked and are reflective of a deep and persistent strain of “scientism” within the intellectual foundation of American approaches to military theory, doctrine, and planning.2

Origins of American Military Scientism

Observers have long suggested a distinct techno-scientific orientation as the defining characteristic of American approaches to strategy, doctrine, and planning. Early military theory in the United States was based largely on inherited European traditions profoundly influenced by Newtonian logic with emphasis on deterministic relationships and predictable linear interactions between forces.3 Discovery of laws describing the natural universe led to the search for similar constants governing interactions among armies in the field. Such early examples of “military scientism” reflected a growing belief that warfare, like other natural phenomena, could be analyzed to reveal basic patterns and predictable characteristics.

These precedents deeply influenced early American approaches to military theory, which held that authoritative scientific principles should serve as the basis for doctrine, while technological innovation came to be viewed as the transformational element in the history of warfare. The founding of West Point in the early 19th century reflected these influences, particularly under the early leadership of superintendent Sylvanus Thayer, who firmly entrenched a technical and engineering-based curriculum as the preferred intellectual foundation for military leaders. This approach was reinforced under Professor Dennis Hart Mahan, who was instrumental in transferring European knowledge and practices to the Academy with particular emphasis on engineering, fortifications, ballistics, and topography as core elements of military education.

Within this context, Baron Antoine-Henri de Jomini emerged as perhaps the most influential military theorist in 19th-century America. The Swiss-born officer held that all strategy was “controlled by invariable scientific principles” and attempted to reduce its conduct to prescriptive rules deeply rooted in empirical methods and analysis of historical example.4 Indeed, his “scientifically” derived concepts of mass, maneuver, and lines of operation remain central to American doctrine and military theory to this day.

Carl von Clausewitz was the other dominant influence on late 19th-century American military thinking. With his emphasis on complexity and ambiguity, Clausewitz is often viewed as the theorist more relevant to modern “nonlinear” warfare, yet his vocabulary also reflects the powerful influences of Renaissance-era science, particularly his use of Newtonian analogies—force, mass, center of gravity—to describe the nature of armed conflict.5 Indeed, central to Clausewitzian thought is the concept of “friction,” illustrating the role that chance and uncertainty play as determining factors in war. Like Jomini, Clausewitz shared the view that knowledge of science combined with practical experience and deep study of history was fundamental in preparation for command. However, he was less convinced of the utility of universal principles and sacrosanct theory as guides to the conduct of war. Rather, Clausewitz suggested that the purpose of theory was to educate the mind of a leader rather than “accompany him to the field of battle.”6 Furthermore, he cautioned against the tendency for theory to furnish commanders with positive doctrines and systems to be used “like mental appliances.”7

Within this intellectual milieu evolved the concurrent phenomena of military professionalization and industrialization, both serving to reinforce America’s emerging techno-scientific approach to warfare. Lessons of the Civil War awakened theorists to the criticality of mobility, logistics, and industrial production as central aspects of strategic calculation. Additionally, the decades prior to World War I marked a period of intense scientific, technological, and industrial innovation transforming the practice of warfare with the introduction of radio, submarines, airplanes, automobiles, machine guns, and high explosives.

Theorists and planners were not only embracing the promise of new technology but also examining how scientific methods and modern management practices could be transferred from the laboratory and factory floor to the battlefield. Development of the modern staff system and functional specialization reflected this impulse, necessitated in part by the increasingly complicated management tasks associated with mass mobilization and logistical demands of industrial age warfare. This evolution also demanded more formalized systems of military training and education with an emphasis on structured methodologies and codified doctrine. Just as scientific management practices rationalized the process of industrial production, military theorists attempted to bring “order, regularity, and predictability” to the practice of war.8

Among influential 20th-century military theorists, B.H. Liddell Hart was one of the more devout believers that the scientific study of warfare would reveal “a few truths of experience which seem so universal, and so fundamental, as to be termed axioms.”9 Though best known for his advocacy of the “indirect approach” and tenets of maneuver warfare, Liddell Hart’s thinking reflected an increasingly influential pedagogical perspective viewing history as the laboratory of military science. “If the study of war in the past has so often proved fallible as a guide to the course and conduct of the next war,” he noted, “it implies not that war is unsuited to scientific study but that the study has not been scientific enough in spirit and method.”10

J.F.C. Fuller, another dominant intellectual influence of the interwar period, took this notion to its logical conclusion and argued for direct application of scientific methodologies to the study of warfare, asserting nothing less than his desire “to do for war what Copernicus did for astronomy, Newton for physics, and Darwin for natural history.”11 Through exhaustive historical analysis of warfare from antiquity to the modern era, Fuller became convinced that such methods would “enable the student to study the history of war scientifically, and to work out a plan of war scientifically, and create, not only a scientific method of discovery, but also a scientific method of instruction.”12

New York Air National Guard’s 109th Airlift Wing flies LC-130 over Greenland on mission to resupply remote science research outposts (DOD/Fred W. Baker II)

The views of Liddell Hart and Fuller reflected a growing confidence in the promise of scientifically managed warfare based on technological innovation and empirically derived approaches. This phenomenon was not limited to land warfare. Strains of such thinking were clearly present in Alfred Thayer Mahan’s theories on seapower and the interplay of technology, geography, and tactical principles. Airpower theory was equally driven by techno-scientific approaches exemplified by influential thinkers such as Giulio Douhet, Billy Mitchell, and Hugh Trenchard, who variously promoted strategies based on innovative technologies linked with theoretical yet largely unproven principles of employment and effect.

World War II came closer than any modern conflict to validating the notion that the coupling of technology and scientific management could deliver desired and predictable strategic ends. Paul Kennedy’s recent study of the conflict masterfully depicts a “scientists’ war” highlighting the remarkable achievements of mid-level engineers and managers who developed technical, organizational, and process innovations to overcome many of the war’s biggest challenges. Kennedy focuses particularly on issues such as convoy security, strategic bombing, and amphibious landings, where rapid fielding of technical solutions combined with doctrinal and tactical adaptability delivered significant and measurable advantages that proved decisive in winning the war.13

By this analysis, World War II may be read as vindication of the techno-scientific approaches advocated by Jomini, Liddell Hart, and Fuller. However, one must consider whether the war represented an exemplar or an isolated aberration. First, one is struck by the remarkable symmetry in means and method of the major combatants, particularly in terms of technological sophistication, industrialization, organizational structures, and, to some degree, doctrinal approach. Certainly when contrasted with other conflicts of the modern era, it is the similarities between combatants more than the differences that seem noteworthy. Moreover, Kennedy notes that many of the central military challenges of the conflict—issues of time, distance, and production—were problems particularly well suited to structured analysis and technical and managerial solutions. Multiple elements central to wartime strategy such as convoy security and strategic bombing provided relatively straightforward feedback loops enabling clear analysis, unambiguous experimentation, and rapid implementation of functional solutions.

In any case, lessons of victory profoundly influenced subsequent approaches of the Cold War era. From the tactical to the strategic level, the military turned to applied science, operations research, and systems analysis to address the most complex national security challenges of the postwar period. Characteristics of the principal Cold War adversaries—structured, homogenous, hierarchical, and doctrinally based—served to reinforce the conclusion that military planning and decisionmaking might be mastered through algorithms and process models. The field of intelligence as much as any other became defined by such approaches. Technical collection capabilities managed by centralized bureaucracies proved remarkably effective at producing detailed information on highly structured conventional threats. In other respects, the rise of the Cold War–era techno-scientific regime was necessitated by the increasingly complicated demands of managing a massive and widely dispersed standing military. Theorist Martin van Creveld observed that the expanding scope of military operations, logistics networks, and occupational specialization increasingly demanded centralized control and the leveraging of science, mathematics, and advanced communications to enable effective coordination on such a massive scale.14 This trend naturally reinforced reliance on systems analysis, operational research, and statistical methodologies as basic tools for military decisionmaking and planning.

These trends had a profound influence on the approaches employed by Defense Secretary Robert McNamara during the Vietnam conflict, particularly efforts to translate tactical feedback into quantifiable metrics for analyzing and guiding strategic-level decisionmaking. Antoine Bousquet describes the concept of “cybernetics,” evolving out of World War II, as engendering an “understanding of war which strove to frame the use of military force into an activity totally amenable to scientific analysis, to the detriment of other forms of thought.”15 However, the shortcomings of these methods in Vietnam did little to challenge the prevailing notion that warfare could be analyzed and managed with scientific precision. Bousquet cites as a high point of this trend the advent of theories formalized under the rubric of “revolution in military affairs” (RMA) in the decades following Vietnam.

In essence, RMA maintained that technological innovation and integrated advances in weapons, information processing, communications, organizational management, and doctrinal approaches would be the primary drivers of future military advantage. RMA emphasized operations research and systems analysis to frame strategy and planning decisions as engineering problems to be solved through data collection and analysis, presuming that measurable risk and outcome probabilities could be estimated with reasonable confidence through adherence to doctrinal methods. These process-oriented methods became increasingly formalized and to this day dominate the pedagogical approach to professional military education.

Even with the end of the Cold War, military theory and doctrinal development continued to reflect the persistent influence of techno-scientific approaches, notably with concepts such as network-centric warfare and effects-based operations, ideas closely related to the cybernetic methods of the Vietnam era and later RMA efforts. These doctrinal theories were premised on analyzing the battlefield environment as a holistic system of interdependent nodes and causal linkages that could be identified and acted upon with measured and predictable effect. This process was enabled by conceptual models such as operational net assessment and system-of-systems analysis. These models apply computational tools, algorithms, and data-intensive analyses to disaggregate key dynamics of a given operational environment and then revisualize it as a coherent and holistic system.

After a decade of conflict defined by unconventional adversaries, complex environments, and ambiguous operational endstates, a new era of military scientism is already taking form. The contours of this next evolution might be described as “post-Newtonian, post-Jominian.” Army Design Theory has emerged as the conceptual basis of a new approach to planning in complex environments. Meanwhile, military theorists are looking to fields such as advanced mathematics, theoretical physics, and biology for insights into complex system behavior and modeling intervention strategies. Other efforts are exploring chaos theory and related fields for tools to analyze environmental propensities of conflict zones and emergent security instabilities and to map the system dynamics of terrorist networks and insurgencies. Despite a new vocabulary, the essence of these approaches remains firmly grounded in the basic presumptions of the techno-scientific regime. By all evidence, military scientism remains as powerful an influence as ever in the American tradition.

Fatal Striving: Hayek, Scientism, and the Limits of Useful Knowledge

Friedrich Hayek identified a similar phenomenon in his own field of economics, notably articulated during his 1974 Nobel Prize lecture in which he cautioned colleagues against misapplication of scientific-like methods to tasks for which they were unsuited. Hayek expressed concern that “confidence in the unlimited power of science is only too often based on a false belief that the scientific method consists in the application of a ready-made technique, or in imitating the form rather than the substance of scientific procedure, as if one needed only to follow some cooking recipes to solve all social problems.”16 His criticisms were directed at the intersection of the social sciences and public policy, where he saw vague imitations of scientific methodologies applied inappropriately to the management of complex social phenomena. He labeled such practices intellectual “charlatanism” intended primarily to lend legitimacy and a pretense of precision to policy prescriptions amounting to little more than blind tinkering in areas where fundamental uncertainty prevailed. Indeed, Hayek could well have been speaking of military science when he described the curious task of economics as demonstrating “to men how little they really know about what they imagine they can design.”17

As a young soldier in the Austro-Hungarian army along the Italian front during World War I, Hayek certainly did not lack exposure to the complexity and arbitrariness of armed conflict. Later in his career, he described the inherent challenges of decisionmaking in environments characterized by fragmentary information. He was particularly interested in how such systems resisted submission to hierarchical, centralized planning—a notion directly challenging the fundamental premise of deliberate design.18 Though not a military theorist per se, Hayek offered insights into the use of knowledge, the function of complex systems, and the dangers of scientism that hold important lessons for the contemporary strategist, planner, and student of military theory.

A foundational element of Hayek’s worldview relates to his observations concerning the “unavoidable imperfection of man’s knowledge.”19 The phrase should not be misunderstood as resignation to intellectual nihilism. Rather, it reflects a profound insight about the nature of information, particularly pertaining to environments where data is dispersed, tacitly understood, or in forms resistant to detection, collection, and analysis, thus rendering it too subjective to be a basis for scientifically valid conclusions. In this sense, Hayek describes the essence behind Clausewitz’s famous dictum that intelligence reports in war are often “contradictory; even more are false, and most are uncertain.”20 As a result, theory formation in the social sciences is often a function of information availability.21 This situation naturally promotes forms of selection bias when information critical to understanding system behavior is too disaggregated for systematic collection or simply ignored due to its uncertain significance. Both Bousquet and van Creveld identified such “information pathologies” during the Vietnam conflict, where pseudo-scientific approaches to strategy evolved based on the most easily quantifiable characteristics of the battlefield, thereby conflating counting with understanding.22

A widely circulated recent paper concerning intelligence in Afghanistan noted that even after a decade of war, the American military still finds “itself unable to answer fundamental questions about the environment in which we operate.”23 The authors posit that a central problem has been the inability to aggregate useful information existing at the lowest levels for use by higher-level decisionmakers, noting that the ground soldier or local development worker is generally best informed about their particular environment, while the path “up through the levels of hierarchy is normally a journey into greater degrees of cluelessness.” The paper identifies the central obstacle to gathering and acting upon relevant information as a matter of inadequate organizational structure. Hayek, by contrast, would argue that the basic issue is not a result of flaws in organizational structure, but rather something more fundamental about the nature of knowledge in complex systems. He points out that circumstances defining outcomes in complex environments are rarely, if ever, fully accessible to the social scientist, policymaker, or military planner, no matter how information is collected and acted upon.

To some degree, this situation reflects the inescapable reality of military science and the fundamental epistemological challenge of analyzing complex social phenomena. With historical example as its laboratory, military theory relies on ex post facto analysis of what are essentially natural experiments. This entails several limitations. As a mode of analysis, historical narrative is fundamentally linear and deterministic. Its aim is to find causality, thereby minimizing the role of chance. It veils complexity and shies from ambiguity. Its vernacular tends toward the anecdotal, interpersonal, and spectacular. History does not always know what it does not know. Ultimately, what it provides is reasoning by induction—drawing general rules from specific examples. It is non-empirical in that it relies on uncontrolled data. Perhaps most importantly, as a basis for applied theory, it lacks mechanisms of validation through experimental replication—the essence of scientific methodology.

In his recent book, Jim Manzi suggests the limited practical utility of the nonexperimental social sciences, noting these fields are generally “not capable of making useful, reliable, and non-obvious predictions for the effects of most proposed policy interventions.”24 However, in the case of military science, historical interpretations often become a proxy for theory or, at the very least, the basis for instrumentalist approaches to operational decisionmaking. Unlike in the physical sciences, where a hypothesis may be proposed, tested, and potentially disproved, military science generally does not offer falsifiable propositions. This characteristic, according to Karl Popper, is what distinguishes science from pseudo-science and separates technical prediction from mere “prophecy.”25 Clausewitz was sensitive to these limitations as well, noting that “no empirical science, consequently also no theory of the art of war, can always corroborate its truths by historical proof.”26 Notwithstanding General George Patton’s assertion that the successful soldier must know history, recent scholarship by Daniel Kahneman, Philip Tetlock, Nassim Taleb, and others suggests substantive limitations in applying historical pattern analysis as a basis for predictive decisionmaking, particularly in the case of unstructured problems and complex systems.

Much of Kahneman’s work on bias and systematic error in expert judgment focuses on the limitations of derived heuristics in fields dependent on analysis of historical case study.27 This mode of theorizing reinforces a powerful human tendency to think in terms of association, metaphor, and inferred causality, with cognitive strategies giving rise to rules of thumb based on crude pattern recognition. Kahneman suggests such techniques feed overconfidence based on the certainty of hindsight, leading planners to view the world as far more coherent and orderly than it is. Others have termed this tendency “folk science,” whereby humans naturally create “illusions of explanatory depth” in their analysis of complex functions, often entirely unaware of how this masks inaccuracies in understanding.28 All of these factors contribute to what Kahneman calls the “planning fallacy,” or the tendency to underestimate the difficulty of implementing a plan while simultaneously overestimating one’s ability to shape future outcomes.

However superficially military planning methodologies may resemble scientifically derived processes, Hayek reminds us that the enormous predictive power of the physical sciences is based on laws derived from experiments with relatively few variables that may be isolated and carefully measured, whereas complex social phenomena inevitably involve indeterminable variables either unmeasurable or unknown to the observer. Even in the best of circumstances, the use of scientific-like methods of analysis offers little more than crude pattern prediction or a generalized understanding of system dynamics.

Clausewitz famously observed that “three quarters of the factors on which action in war is based are wrapped in a fog of greater or lesser uncertainty.”29 Hayek certainly would agree. He reminds us that in fields where essential complexity exists, the planner must understand that “he cannot acquire the full knowledge which would make mastery of the events possible.”30 Even as the methodologies of the physical sciences are lavishly imitated, the problems facing military planners do not yield equally structured outcomes. One significant reason is that intelligence can never resemble the process of data collection in a laboratory, no matter the level of technical sophistication.

Conclusion

Having rediscovered the primacy of Clausewitzian ambiguity, some theorists now propose Army Design Theory as a means to disentangle complex causality and deliver improved strategies of intervention. It is at this point that caution is warranted. An unfortunate symptom of military scientism has been the tendency for planners to conflate the precision of their tools (weapons and systems) with the methods of their application (theories and doctrine). While the technologies of modern warfare function primarily in a Newtonian universe, methods of their application still reside stubbornly in a Hayekian one. Confusion over this point gets to the heart of the dilemma with military scientism.

Command element from Arkansas Army National Guard’s 142nd Fires Brigade looks over map of Woodruff County in eastern Arkansas in effort to deploy troops in support of evacuation operations due to flooding (DOD/Chris Durney)

Arguably, much of what passes for military planning is less analytically rigorous than meets the eye. The fixtures of doctrinal orthodoxy have created an aura of pseudo-scientific infallibility in the military planning process, rendering its outputs impervious to rational critique. However, too often doctrine is little more than a fig leaf concealing a process driven by gut-feeling heuristics and unsubstantiated causal suppositions. Whereas doctrine should serve the useful function of providing a common language and frame of reference, it also has the undesirable effect of reinforcing the cult of expertise, thereby discouraging integration of diverse tools and nontraditional thinking. This is where it becomes dangerous. As Malcolm Gladwell has noted, whereas incompetence is the malady of the novice, overconfidence is the disease of the expert.31 And it is generally the expert who possesses the greatest potential for creating disasters.

Clausewitz was well aware of the potential dangers of scientism and warned that “much greater is the evil which lies in the pompous retinue of technical terms—scientific expressions and metaphors” that “lose their propriety, if they ever had any, as soon as they are distorted, and used as general axioms, or as small crystalline talismans.”32 In this respect, a healthy dose of Hayekian thinking provides a natural “dampening effect” against unrealistic aspirations. While Hayek’s insights dealt primarily with the functioning of economic markets, the same dynamics apply to military conflict or any other human activity defined by conditions of uncertainty, analytical ambiguity, and predictive indeterminacy. What a Hayekian worldview demands is that one trade certainty for humility, appreciate the limits of useful knowledge, and recognize that plans do not represent an extension of the will. Skepticism must be the order of the day, placing the burden of proof on the doctrinarian.

As a prescription for correcting the worst abuses of military scientism, leaders might benefit from considering methods from fields that at first glance may not seem intuitively similar to military operations, such as biology, epidemiology, or meteorology. These disciplines may offer helpful examples of how military planners can better appreciate the natural limitations of their craft, improve techniques of meta-cognition, and gain greater sensitivity to the uses and abuses of probability. Likewise, repositioning military science as an academic discipline of equal stature with the established social sciences will invite scrutiny of our methods as well as beneficial cross-pollination and improved awareness of our biases.

In the end, we must seek a defensible space between helpless indifference and the present hubris that drives the lofty ambitions of many military planners. One must appreciate that in some situations intuition, training, and experience are simply not enough to endow one with sufficient awareness to predict outcomes with a reasonable degree of certainty. Indeed, the ability to recognize these limits and approach them with humility and intellectual honesty is perhaps the truest mark of a professional. JFQ


Lieutenant Colonel Glenn Voelz, USA, is Chief of the Intelligence Control Division, Directorate for Intelligence (G2), U.S. Army Africa.

Notes

  1. Descriptive theory based on retrospective analysis where variables and environmental conditions cannot be fully known or controlled, versus predictive theory offering falsifiable propositions subject to experimental testing and validation.
  2. Scientism as applied by Friedrich Hayek, Karl Popper, and others to describe inappropriate application of scientific-like methods to contexts where they do not clearly fit or with insufficient empirical evidence to support scientifically valid theories; also processes designed to appear science-like yet lacking rigorous procedural, methodological, or analytical standards.
  3. Notably in Antoine Bousquet, The Scientific Way of Warfare: Order and Chaos on the Battlefields of Modernity (New York: Columbia University Press, 2009).
  4. John Shy, “Jomini,” in Makers of Modern Strategy: From Machiavelli to the Nuclear Age, ed. Peter Paret (Princeton: Princeton University Press, 1986), 146.
  5. Among others, see Alan Beyerchen, “Clausewitz, Nonlinearity and the Unpredictability of War,” International Security 17, no. 3 (1992), 59–90, available at <www.clausewitz.com/readings/Beyerchen/CWZandNonlinearity.htm>.
  6. Carl von Clausewitz, On War, trans. James John Graham (London: Trübner, 1873), book 2, chap. 4.
  7. Ibid.
  8. Bousquet, 30.
  9. B.H. Liddell Hart, Strategy of the Indirect Approach (London: Faber and Faber, 1954), 234, available at <http://archive.org/stream/strategyofindire035126mbp#page/n15/mode/2up>.
  10. B.H. Liddell Hart, Why Don’t We Learn from History (Ann Arbor: Hawthorn Books, 1972), 16, available at <http://pkpolitics.com/files/2008/05/liddell-hart-why-dont-we-learn-from-history.PDF>.
  11. J.F.C. Fuller, The Foundations of the Science of War (London: Hutchinson, 1926), available at <www.cgsc.edu/Karl/download/csipubs/FoundationsofScienceofWar.pdf>.
  12. Ibid., 35.
  13. Paul Kennedy, Engineers of Victory: The Problem Solvers Who Turned the Tide in the Second World War (New York: Random House, 2013).
  14. Martin van Creveld, Command in War (Cambridge, MA: Harvard University Press, 1985), 106.
  15. Antoine Bousquet, “Cyberneticizing the American War Machine: Science and Computers in the Cold War,” Cold War History 8, no. 1 (2008), 77–102.
  16. Friedrich Hayek, “The Pretense of Knowledge,” American Economic Review 79, no. 6 (1989).
  17. Friedrich Hayek, The Fatal Conceit: The Errors of Socialism (Chicago: University of Chicago Press, 1988), 76.
  18. Friedrich Hayek, “The Use of Knowledge in Society,” American Economic Review 35, no. 4 (September 1945), 519–530, available at <www.econlib.org/library/Essays/hykKnw1.html>.
  19. Ibid., 530.
  20. Carl von Clausewitz, On War, ed. and trans. Michael Howard and Peter Paret (Princeton: Princeton University Press, 1984), 117.
  21. Hayek, “Pretense of Knowledge.”
  22. Van Creveld, 240.
  23. Michael T. Flynn, Matt Pottinger, and Paul D. Batchelor, Fixing Intel: A Blueprint for Making Intelligence Relevant in Afghanistan (Washington, DC: Center for a New American Security, 2010), available at <www.cnas.org/files/documents/publications/AfghanIntel_Flynn_Jan2010_code507_voices.pdf>.
  24. Jim Manzi, Uncontrolled: The Surprising Payoff of Trial and Error for Business, Politics, and Society (New York: Basic Books, 2012), xi.
  25. Karl Popper, The Poverty of Historicism (New York: Routledge, 2002).
  26. Clausewitz, On War, trans. Graham, book 2, chap. 6.
  27. Daniel Kahneman, Thinking, Fast and Slow (New York: Farrar, Straus and Giroux, 2011).
  28. Leonid Rozenblit and Frank Keil, “The Misunderstood Limits of Folk Science: An Illusion of Explanatory Depth,” Cognitive Science 26, no. 5 (2002), 521–562.
  29. Clausewitz, On War (Princeton), 104.
  30. Hayek, “Pretense of Knowledge,” 7.
  31. Malcolm Gladwell, C-SPAN interview with Brian Lamb, November 30, 2009.
  32. Clausewitz, On War, trans. Graham, book 2, chap. 6.