News | Jan. 18, 2023

America Must Engage in the Fight for Strategic Cognitive Terrain

By Daniel S. Hall | Joint Force Quarterly 108


Colonel Daniel S. Hall, USA, is the Director of Intelligence, Joint Task Force North, El Paso, Texas.
President Joe Biden delivers remarks on war in Ukraine

The role of nonmilitary means of achieving political and strategic goals has grown, and, in many cases, they have exceeded the power of force of weapons in their effectiveness. . . . The information space opens wide asymmetric opportunities to reduce the combat potential of the enemy.

—Valery Gerasimov
Chief of the General Staff of the Russian Armed Forces1

In February 2017, rumors of a Lithuanian girl’s rape by German soldiers belonging to a North Atlantic Treaty Organization (NATO) battlegroup rapidly spread on social media. The allegations evoked visceral reminders of Nazi occupation during World War II. Despite the Lithuanian government’s insistence that the rape never occurred, the persistent rumor jeopardized Germany’s participation in NATO’s Enhanced Forward Presence mission. NATO suspected that the rumor originated in a Russian propaganda source. The rumor was eventually quieted, with NATO commanders stressing that defending against false narratives is essential for sustaining the Alliance’s cohesion.2 Modern societies live in an information-saturated age, in which manipulators take advantage of environmental and human factors to make it difficult for people to distinguish truth from fiction. This opening vignette serves as a rudimentary example of how propagandists exploit these factors to weaponize information to advance their political agenda.

Strategic competitors seek influence over geopolitical relationships to balance against the United States. However, they generally deem direct military confrontation too risky to achieve their strategic aims. Therefore, instead of a purely forceful approach, they may seek opportunities to employ psychological, ideological, and informational approaches waged within gray zones to unbalance U.S. hegemony.3 The term gray zone is commonly associated with military operations that blur the lines between war and peace. However, gray zone in the context of this article is used to describe the application of nonmilitary means that couple advancements in psychosocial science with cutting-edge information technology in psychological capitulation strategies intended to erode the West’s will to resist. The manipulation of strategic cognitive terrain via gray competition zones characterizes modern warfare, serving as an example of an attack on the people’s “passion” part of Carl von Clausewitz’s “paradoxical trinity.”

Clausewitz emphasized in his unfinished manuscript On War that war’s nature requires the continual balancing of passion, chance, and reason.4 Imbalance among the trinity’s elements can tip significant strategic advantages to an opponent. The irreversible psychological momentum (that is, reason) that North Vietnam gained once American societal support (passion) eroded following the 1968 Tet Offensive (chance) exemplifies the strategic repercussions that can occur when the paradoxical trinity is disturbed.

German film director Leni Riefenstahl looks through large camera with cinematographer Sepp Allgeier during Nazi Party Congress in Nuremberg

Many experts agree that U.S. national security is increasingly threatened as opponents push anti-West information toward the center of conflict.5 However, few publications offer recommendations for ways the U.S. military can defend against perceptual manipulation. Countering weaponized information with military means is problematic, because liberal societies value well-intentioned, credible information. Additionally, political scientist Joseph Nye counsels that informational credibility prospers in uncensored and critical civil societies, whereas government-subsidized information is perceived as “rarely credible.”6 Thus, the U.S. aversion to government-sponsored ideological messages hampers the military’s ability to counter threat narratives.

Given the relative ease with which adversaries conduct perceptual manipulation operations that dominate strategic cognitive terrain, inaction is no longer a viable option. This article therefore seeks to arm the Department of Defense (DOD) with ways to close exploitable cognitive gaps where malignant information thrives. Cognitive dissonance theory and interrelated psychodynamic concepts are introduced to illustrate the relative ease with which societal perceptions are manipulated. These concepts are applied to Russia’s fight for strategic cognitive terrain to demonstrate how rivals manipulate societies to realize their national security aims. Recommendations are also provided to help the U.S. military operationalize global integrated plans that protect the strategic cognitive domain against societal perceptual manipulation.

Shades of Propaganda

In the article “Propaganda: Can a Word Decide a War?” Dennis Murphy and James White reference the Joint Chiefs of Staff definition of “propaganda”: “any form of communication in support of national objectives designed to influence the opinions, emotions, attitudes, or behavior of any group in order to benefit the sponsor, either directly or indirectly.”7 Propagandists have historically combined compelling images with manipulated narratives to sway human affect. Consider the film Triumph of the Will, intended to legitimize Adolf Hitler’s Nazi ideology, which demonstrates the power of connecting dazzling imagery with messaging to influence opinion. At the time, such far-reaching propaganda campaigns could be lengthy and expensive undertakings. In contrast, modern communications afford states relatively cheap means by which to transmit appealing messages at a ceaseless pace. Contemporary societies are bombarded by captivating stimuli as a result. Tidal waves of information make it nearly impossible to sift through terabytes of data to identify the discrete bits that reveal truth. Protecting populations against propaganda is difficult because individual personality traits affect each person’s susceptibility to manipulation. Modern communication’s ease at transmitting information therefore opens endless opportunities for adversaries to broadcast ever more dangerous genres of propaganda.

Figure 1. Propaganda Zones

Strategic communications expert Donald Bishop classified people’s individual information vulnerabilities into black, white, and gray zones (see figure 1).8 The willingly deceived reside in the black zone social space. Their rejection of universally accepted explanations makes them unreliable collaborators; all sides can easily mislead them. Equally in the minority are those in the white zone, whose high standard for determining truth makes them hard to fool. Most of the strategic cognitive terrain is the gray zone, occupied by people who are influenced by catchy headlines and other forms of “click bait” and form their judgments in part on that basis. Such consumers of information are usually not happily deceived. However, whereas information is plentiful, human attentional resources are extremely limited. Human task-shedding tendencies to alleviate cognitive load lower the threshold for determining truth. Gray zone propaganda therefore constitutes the most dangerous form of propaganda, because these time-saving measures often lead to misjudgments by people in the gray zone.

It is important to study the mechanisms of gray propaganda to lay a foundation for understanding how exploiters compete for attention. Unlike black propaganda, which attributes the origins of dishonest information to false sources, gray propaganda conceals the origins of semi-plausible information with unattributable sources.9 Because strategic competitors typically seek positive global opinions, the use of black propaganda is counterproductive; it is easily invalidated. Gray propaganda is better suited to delivering the desired perceptual effects; it is difficult to disprove.10 Strategic competitors have therefore invested heavily in social and mass media outlets to extend their strategic communications reach to broader audiences.

Investments in information technology alone are not sufficient to destabilize liberal democratic systems. The psychological efficacy of information is the most critical aspect of realizing a strategic vision. Thus, successful information operations stimulate human behaviors toward desired perceptual objectives.11 Understanding how exploiters manipulate complex human perceptual processes is fundamental when designing counterpropaganda operations to protect societies from malign information campaigns.

Manipulating Perceptual Constructs

Many information experts concur that strategic competitors are pushing societal perceptions toward the center of conflict.12 Yet few publications explain how competitors can successfully leverage perceptual manipulation to achieve political objectives. Dennis Murphy and Daniel Kuehl touched on cognitive dissonance theory as a means for “seeking a synergistic balance between securing connectivity and exploiting content to achieve cognitive dissonance leading to behavioral change.”13 But they offered no insights into how cognitive dissonance can be leveraged to spark desired behavioral change within whole societies. The incomplete literature on perceptual manipulation led communications expert Jess Nerren to advocate renewed investigation into the theory; she writes that “the rise of fake news and the drive for greater media literacy” have opened new opportunities to explore “cognitive dissonance and [its] effects on behaviors.”14 This article therefore treats cognitive dissonance theory, noted for its reliability in explaining behavioral change, as a starting point for exploring how manipulators create gray propaganda that achieves its intended strategic effects.

Figure 2. Cognitive Dissonance Model

Cognitions are ideas, attitudes, and beliefs that form the constructs of human perception.15 Cognitive theory holds that people strive to maintain coherence between cognitions. Inconsistent cognitions initiate anxiety, which causes a person to rebalance cognitions and thereby relieve internal tension.16 Studies on dissonance show that even simple inconsistencies, such as failure to signal when changing lanes in busy traffic, can induce discomfort that a person must harmonize.17 The reduction mechanisms available for people to diminish dissonance (combined below with examples from the lane-changing situation) include (see figure 2):18

  • terminating inconsistent cognitions (always signal when changing lanes)
  • changing original cognitions to match new cognitions (never signal when changing lanes)
  • trivializing cognitions (others do not signal when changing lanes)
  • considering new factors to balance cognitions (removing hands from the steering wheel to signal can jeopardize vehicular control).
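For readers who prefer a structured rendering, the following sketch (a hypothetical Python illustration, not part of any fielded analytic tool) simply enumerates the four reduction mechanisms from figure 2 alongside the lane-changing examples above; the type names and the mapping are illustrative assumptions.

    from enum import Enum, auto

    class DissonanceReduction(Enum):
        """Festinger's four dissonance reduction mechanisms (labels follow figure 2)."""
        TERMINATE_INCONSISTENT = auto()
        CHANGE_ORIGINAL = auto()
        TRIVIALIZE = auto()
        ADD_BALANCING_FACTORS = auto()

    # Illustrative mapping of each mechanism to the lane-changing example in the text.
    LANE_CHANGE_EXAMPLES = {
        DissonanceReduction.TERMINATE_INCONSISTENT: "Always signal when changing lanes.",
        DissonanceReduction.CHANGE_ORIGINAL: "Never signal when changing lanes.",
        DissonanceReduction.TRIVIALIZE: "Others do not signal when changing lanes.",
        DissonanceReduction.ADD_BALANCING_FACTORS:
            "Removing hands from the wheel to signal can jeopardize vehicular control.",
    }

    if __name__ == "__main__":
        for mechanism, example in LANE_CHANGE_EXAMPLES.items():
            print(f"{mechanism.name}: {example}")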

It is important to note that the dissonance reduction mechanisms available to humans are subconscious processes. Innate limitations on self-awareness make humans extremely susceptible to manipulation. The complex psychodynamic processes that humans employ to diminish dissonance provide propagandists several avenues by which to steer perceptions toward the center of conflict.

Understanding the power that beliefs hold over one’s psyche—and how reduction mechanisms are susceptible to manipulation—is critical. The persistent effects of beliefs on human perception are so influential that they cause people to automatically dismiss counterinformation,19 and human preference for being right activates heuristics that bar critical thinking. These cognitive barriers lead to biases that focus efforts on identifying evidence that only supports one’s own conclusions. Exploiters take advantage of these human tendencies to fabricate propaganda that influences people to not consider even more plausible explanations for events.

Anxiety’s principal role in cognitive dissonance places emotions as the fundamental force behind perceptual change.20 Strong emotions are difficult to ignore, whereas weaker emotions quickly subside. Studies of online content “virality” discovered that anger- or fear-inducing narratives travel faster, reach more audiences, and persist longer than positive-arousing narratives.21 Exploiters harness prolonged periods of strong negative emotions to change people’s cognitions to the benefit of the antagonist’s agenda. Recall the furor described in the opening vignette when a horrifying rumor caused Lithuanian citizens to disfavor a recently heralded national security policy.

Though humans dislike anxiety, people regularly commit behaviors dissonant with their stated beliefs. A recent study found that peer group social norms and locus of control are powerful psychosocial constructs that allow people to commit dissonant behaviors without feeling guilty.22 Those who exhibit high external locus of control are more likely to assign blame to others for their own actions. Additionally, a person is more likely to perform dissonant acts that conform to a peer group’s social norms. Exploiters manufacture peer group environments that influence people to trivialize inconsistent cognitions and commit dissonant behaviors that advance the exploiters’ malign agendas. Furthermore, manufactured environments that assign scapegoats for peer groups to blame as the cause of their behaviors are exponentially more effective at instigating people to trivialize inconsistent cognitions.

Understanding the framework on which propagandists create environments that stimulate human affect via dissonance reduction manipulation enriches our understanding of how entire societies may be influenced to commit self-destructive behaviors. The good news is dissonance reduction manipulation alone will not permit exploiters to create narratives that sell to mass audiences. Exploiters must adeptly combine the art of persuasion with cognitive dissonance and other interrelated psychosocial constructs to develop gray propaganda that propels behaviors toward their desired objectives.

Army Soldiers conduct leaflet drop in several villages surrounding Hawijah

Attracting Strategic Audiences

Joseph Nye argued that government-controlled information cannot deliver desired strategic effects because its disingenuousness makes it unattractive to broad audiences. He reinforced this perspective by arguing that Chinese attempts to charm international audiences have produced limited returns.23 As stated above, gray propaganda is not necessarily entirely untruthful; it is semi-plausible. However, recent changes in longstanding geopolitical alignments, such as Asia’s Regional Comprehensive Economic Partnership’s invitation for Chinese Belt and Road extension beyond the nine-dash line, suggest that asymmetric narratives such as China’s can affect global audiences.

Robert Cialdini calls asymmetric narratives that deliberately mislead recipients into behavior that benefits the sender weapons of influence, and he asserts that these weapons are so persuasive that people find it difficult to resist their attractive power. Cialdini notes that creating weapons of influence is simple because they require only psychological triggers to propel human behaviors toward intended perceptual objectives.24 It is at this juncture that psychodynamic constructs become useful tools for propagandists. When injected into messages that grab attention, are simple to understand, and resonate with the receiver, manipulated dissonance reduction mechanisms constitute weapons of influence that persuade even the most skeptical consumers of information.25

However, manipulating the opinions of whole societies requires exploiters to design narratives that conform to the targeted audience’s cultural and linguistic frames. Commonly held ideas passed through the generations guide societal behaviors; it is impossible to create a one-size-fits-all narrative that can corral a unitary perspective on an issue.26 Attempts to do so can result in targeted audiences forming interpretations that conflict with the sender’s intent; this variable makes it difficult for propagandists to calculate whether audiences will form desired perceptions. However, as Clausewitz notes, people’s passion can cause societal forces to act contrary to rational cultural norms.27 Manipulated dissonance reduction mechanisms embedded within culturally relevant narratives create psychological triggers that can thrust irrational societal tendencies to the forefront. These dynamics make it possible for societies to fall victim to gray propaganda.

Figure 3. Strategic Actor Model

To be successful, adversaries tailor gray propaganda toward aligned, neutral, and opposed actors who revolve around distinct perceptual centers of gravity (see figure 3).28 Aligned actors champion the adversary’s foreign policies; propagandists propel these actors toward perceptual objectives that advance their security agenda. Since neutral actors have geopolitical alternatives, propagandists exert more energy to propel them toward perceptual objectives that expand their security agenda. Whereas aligned and neutral actor orbits tend to act as if propelled by centripetal force, opposed actor disagreement acts as if propelled by centrifugal force against the adversary’s perceptual center of gravity. Propagandists apply pressure to propel opposed actor perceptions toward increased ambivalence.
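As a rough illustration of the strategic actor model in figure 3, the sketch below encodes the three actor classes and the propaganda objective the article associates with each. It is a hypothetical Python rendering; the example actors and the objective wording are assumptions made only for illustration.

    from dataclasses import dataclass
    from enum import Enum

    class Alignment(Enum):
        """Strategic actor orientation relative to the adversary (see figure 3)."""
        ALIGNED = "aligned"    # champions the adversary's foreign policies
        NEUTRAL = "neutral"    # retains geopolitical alternatives
        OPPOSED = "opposed"    # resists the adversary's perceptual center of gravity

    @dataclass
    class StrategicActor:
        name: str
        alignment: Alignment

    def perceptual_objective(actor: StrategicActor) -> str:
        """Return the propaganda objective the article associates with each actor class."""
        if actor.alignment is Alignment.ALIGNED:
            return "Propel toward perceptual objectives that advance the security agenda."
        if actor.alignment is Alignment.NEUTRAL:
            return "Exert greater energy to pull toward the security agenda."
        return "Apply pressure to push perceptions toward increased ambivalence."

    if __name__ == "__main__":
        for actor in (
            StrategicActor("Domestic audience", Alignment.ALIGNED),
            StrategicActor("Commonwealth of Independent States partner", Alignment.NEUTRAL),
            StrategicActor("NATO member", Alignment.OPPOSED),
        ):
            print(f"{actor.name}: {perceptual_objective(actor)}")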

Psychological distance is more important than physical distance when classifying strategic actors. For example, though the Baltic nations share physical borders with Russia, they oppose the Kremlin’s foreign policies. Wary of NATO’s response when Russia is engaging opposed Western actors, Valery Gerasimov, chief of the general staff of the Russian military, acknowledged that modern information networks provided asymmetric advantages that can create permanent “long-distance, contactless actions” within opposing states.29 Accordingly, exploration into how the Kremlin competes for strategic cognitive terrain within Russia’s near abroad provides military analysts with a model to examine how adversaries employ psychological capitulation strategies.

Three Air Force F-22 Raptor aircraft fly alongside Air Force KC-135 Stratotanker aircraft over Poland

Russia’s Fight for Strategic Cognitive Terrain

Geography shapes Russian perspectives on national security. Fears caused by numerous invasions30 have etched an extreme paranoia of external powers in, to use the term of psychoanalyst Carl Jung, the collective unconscious of the Russian psyche. Maintaining a zone of influence along its borders therefore dominates the Kremlin’s strategic culture. NATO’s enlargement, as well as perceived U.S. backing of color revolutions in Georgia (2003), Ukraine (2004), and Kyrgyzstan (2005), has created the belief that an arc of crisis exists around Russia.31 These beliefs intensify Russian paranoia and heighten desires to expand security zones.

Russian president Vladimir Putin aspires to stabilize the arc of crisis. Putin’s “sovereign democratic” construct is therefore specifically designed to counter the West’s encroachment in Russia’s near abroad. Putin’s sovereign democratic structure envisions the amalgamation of friendly neighbors who exercise complete control over their economies and maintain strong militaries to oppose liberal democratic influence.32 Putin’s goals are to secure Russia’s borders and fracture NATO. However, certain that NATO will honor pledges to defend its members, Putin prefers indirect approaches over direct military confrontation. The Russian military’s initiation of the so-called “special military operation” in Ukraine on February 24, 2022, exemplifies the Kremlin’s operationalization of Putin’s vision. The invasion of Ukraine ultimately seeks to secure what Putin perceives as the most vulnerable region for continued NATO encroachment along his near abroad, while simultaneously employing information campaigns that test NATO’s unity and the West’s will to resist Russian security objectives.

In 2013, Gerasimov challenged state apparatuses to not only learn the lessons of the nontraditional military means employed during the Arab Spring and the so-called color revolutions, but also to get ahead of the curve and figure out how the Russian military can apply them. Chief among his thoughts was the use of information warfare to reduce the combat potential of superior forces.33 Gerasimov’s thoughts on 21st-century warfare prompted the Russian General Staff to discover indirect approaches that place human perception at the center of gravity and open societal fault lines that turn liberal democratic norms and institutions against themselves.34

Two prominent examples—the Russian onslaught of gray propaganda that widened preexisting Ukrainian societal fissures to set conditions for the annexation of Crimea, and the introduction of “little green men” in the Donbas following Kyiv’s 2013 Euromaidan demonstration—highlight the Kremlin’s growing expertise at manipulating perceptions. Though this military intervention dampened Ukraine’s budding relationship with the European Union, the Kremlin realized that it cannot achieve Putin’s revanchist aims while NATO remains in its near abroad. Thus, lessons from Russia’s 2014 intervention in the Donbas most likely led to Putin’s 2022 decision to invade Ukraine, thereby permanently removing it from NATO’s influence, while also continuing to employ gray propaganda against opposed Western strategic actors to secure territorial gains.

President-elect Donald J. Trump stands on platform of Capitol during 58th Presidential Inauguration in Washington, DC

Putin must retain aligned actor support to counter further liberal democratic encroachment within Russian zones of influence. Continuous news coverage of U.S. activities in the Balkans and Central Asia reinforces domestic audience biases that the United States surrounds Russia to retain global hegemony. The Kremlin points to the nearly $500 billion annual discrepancy between U.S. and Russian defense spending to reinforce beliefs about U.S. resolve to contain Russia.35 Additionally, reminders of how NATO took advantage of Russia’s weakness following the Soviet Union’s fall have stimulated strong negative emotions that affect Russian society’s inconsistent cognitions between authoritarianism and liberal democracy. Finally, Putin’s nonstop assertions that Russia is ultimately fighting U.S.-backed Western proxies in Ukraine illustrate the Kremlin’s current use of propaganda to retain aligned actors.

Russia aggressively pursues neutral actor movement toward sovereign democratic architectures to balance against the West. The Kremlin exploits pan-Slavic identities in the Commonwealth of Independent States to tightly couple neighboring nations with Russia. Kremlin-funded language, youth education, and Russian Orthodox Church programs create “vertically integrated propaganda networks” that stretch across Eurasia.36 Constant depictions of Western aggression against Serbia, Libya, Syria, and Afghanistan have incited perceptions of liberal democratic conspiracies to destabilize non-Western states and have nurtured confirmation biases that a resurgent Russia is needed to counter the United States.

Russian gray propaganda fosters ambivalence among opposed actors as part of its effort to undermine the West’s collective capacity to refute Putin’s foreign policy agenda. The Kremlin masterfully exploited the 2015–2016 refugee crisis to stoke fear throughout the European community.37 The European Union’s insistence that members maintain open borders caused a crisis of solidarity among national leaders. The crisis spurred the rise of populist governments in NATO members Poland, the Czech Republic, Hungary, and Turkey. Prime Minister Viktor Orban’s advocacy of sovereign democracy as “a new model of governance for Hungary to follow” illustrates the success of Russian gray propaganda in cultivating strong negative emotions that led societies to question whether liberal governments could provide security.38

The United States is not immune to Russian manipulation. Avalanches of Kremlin gray propaganda swept across social media during the 2016 U.S. national elections in an influence campaign intended to make voters trivialize inconsistent cognitions between liberal and populist agendas. A 2017 intelligence community assessment found that Putin personally initiated the information campaign favoring Donald Trump’s election.39 Russian state-sponsored news outlet Russia Today (RT) broadcast hundreds of pro-Trump news stories to nearly 85 million American viewers. RT-produced pro-Trump YouTube videos received nearly 1 million more views per day than pro–Hillary Clinton advertisements. Moreover, the assessment concluded that Russian trolls created more than 50,000 Facebook and 400,000 Twitter accounts whose daily pro-Trump posts were shared millions of times.40

When asked why Putin would prefer him in the Oval Office, Trump responded, “Because I’m a great guy.”41 Or did Putin simply help elect the candidate who had claimed that NATO was obsolete? A 2020 survey’s finding that most people considered Putin more trustworthy than Trump suggests that the United States is losing the cognitive fight; the following recommendations discuss ways to win it.42

U.S. Servicemember and Slovak soldier discuss tactics during NATO exercise Strong Cohesion 2022

Recommendations and Conclusions

Information experts routinely advocate for increased intellectual property protection, election hardening, and education of citizens to identify “fake news” as ways to protect the United States against asymmetric narratives.43 These proposals, however, rely on legislative measures and do not leverage military capabilities to defend the Nation against perceptual manipulation. Politicians must also enact laws that allow DOD to incorporate the psychosocial methods discussed throughout this article into global campaign plans that counter gray propaganda.

The U.S. military should codify a cognitive warfighting domain. Current joint doctrine emphasizes understanding information’s pervasiveness to determine effects on relevant actors and military operations.44 However, Joint Publication 3-13, Information Operations, does not discuss how to shape target audience perceptions for desired strategic effects. Thus, the Joint Staff should formalize the cognitive warfighting domain to provide the military enterprise with the ways and means to prevail on the cognitive battlefield. This recommendation does not advocate for the creation of another combatant command but is intended to encourage the Joint Staff to consider reflagging U.S. Cyber Command and consolidating cyber, electronic warfare, military information support operations, civil affairs, and all other joint information functions under a U.S. Cognitive Dominance Command.45 Furthermore, this recommendation is not intended to replace cyber operations with information operations. Rather, it is intended to emplace the entire information spectrum as the joint warfighting integrator when competing for highly contested strategic cognitive terrain.

The U.S. military should also institute occupational specialists trained to scour the Web and social media platforms for gray propaganda. These “Cyber Scouts” would surveil gray zone social spaces where trolls lurk. Their reconnaissance objective would be the identification of asymmetric narratives requiring immediate refutation. Armed with artificial intelligence (AI) algorithms, Cyber Scouts could work with foreign agents operating within the virtual battlefield. AI data could then be fed to joint targeting operations that would expose and dismantle troll farms, “sock puppets,” and other exploiters, breaking apart the networks that propagate gray propaganda.
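To make the Cyber Scout workflow concrete, the following sketch shows one hypothetical triage routine, assuming a placeholder negative-arousal classifier and illustrative thresholds and field names (none of which reflect an existing DOD capability): it flags fast-spreading, unattributable, anger- or fear-arousing posts as candidates for refutation.

    from dataclasses import dataclass
    from typing import Callable, List

    @dataclass
    class Post:
        post_id: str
        text: str
        shares: int
        unattributed_source: bool  # gray propaganda conceals its origin

    def negative_arousal_stub(text: str) -> float:
        """Toy stand-in for a trained classifier; a real workflow would call a
        natural language processing model scored on anger/fear arousal."""
        return 0.9 if "atrocity" in text.lower() else 0.1

    def triage(posts: List[Post],
               negative_arousal: Callable[[str], float],
               share_threshold: int = 1000,
               arousal_threshold: float = 0.7) -> List[Post]:
        """Flag fast-spreading, anger- or fear-arousing posts from unattributable
        sources as candidates for immediate refutation."""
        return [p for p in posts
                if p.unattributed_source
                and p.shares >= share_threshold
                and negative_arousal(p.text) >= arousal_threshold]

    if __name__ == "__main__":
        sample = [
            Post("1", "Fabricated atrocity story spreading fast", 5000, True),
            Post("2", "Routine exercise announcement", 200, False),
        ]
        for flagged in triage(sample, negative_arousal_stub):
            print("Refer for refutation:", flagged.post_id, "-", flagged.text)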

Updating Murphy and Kuehl’s “3C” information power model of connectivity, content, and cognition to include “compete” and “comprehend” will assist military planners with operationalizing counterpropaganda plans.46 Competition prioritizes getting it right over being right. Lessons learned from the 2008 Russia-Georgia war found that clarity and consistency are more important than micromanaging messages in a 24/7 news cycle.47 The contrast principle holds that initiating messages are more persuasive than responding messages.48 Joint commands should therefore adhere to that principle and broadcast messages that immediately control narratives.

The downing of Malaysian flight MH17 over eastern Ukraine in 2014 highlights the importance of the contrast principle. Anticipating blowback, the Kremlin immediately blamed Ukraine for shooting down MH17. By the time investigators had proved that a Russian-supplied surface-to-air missile had downed the airliner, the news cycle had already moved on to other headlines. Thus, staff fighting for cognitive terrain should not waste time responding to every piece of mis- and disinformation; their sheer volume prevents it. They must instead immediately provide commanders with clear statements when fleeting opportunities arise to erode an adversary’s credibility.

Joint commands should incorporate professionals who are fluent in target audience cultural frames. Linguistic, anthropological, and other cultural experts will enhance the planning staff’s ability to determine what may resonate with specific populations. Staff can use cultural frames consisting of rituals, symbols, and legends to develop a society’s “collective unconscious profile.” Consider the “Century of Humiliation” as it pertains to China’s collective unconscious and how it influences the country’s fervor for supplanting U.S. hegemony in the Pacific. Collective unconscious profiles would help planners harvest narrative potential for targeted audiences.

Joint commands should incorporate psychologists and sociologists to turn collective unconscious profiles into persuasive content. Planners could also leverage graphic artists and advertising specialists to transform messages into influential memes and videos that would immediately grab the receiver’s attention, be simple to understand, and resonate. Military planners would need to share proposed themes and messages with U.S. Embassy public affairs offices in strategic actor nations to gain concurrence on unified messaging approaches. This step would ensure that the right message went to the right audience at the right time.

Joint commands must increase their connectivity to mainstream communications to reach target audiences. Collaborating with preexisting partners would be an inexpensive way to increase capacity. For example, U.S. European Command planners could collaborate with the NATO Strategic Communication Center of Excellence to exploit popular social media platforms. Planners could also leverage U.S. Special Operations Command’s WebOps experts to develop influential memes and videos to refute gray propaganda.

Cognition is where the human mind comprehends information. Successful information operations must stimulate human affect toward intended perceptual objectives. Psychologists can provide dissonance reduction approaches for inclusion within culturally framed messages to produce desired perceptual effects. Planners can use connectivity capabilities to collect the numbers of retweets, shares, and likes to measure message proliferation, persistence, and strategic actor responses. The most important measure is the shrinking of malign actor presence within the strategic cognitive terrain.
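A minimal sketch of such measures of performance, assuming hypothetical daily engagement counts collected from platform data, might look like the following; the metric definitions and the engagement floor are illustrative assumptions rather than established doctrine.

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class DailyEngagement:
        day: int
        retweets: int
        shares: int
        likes: int

        def total(self) -> int:
            return self.retweets + self.shares + self.likes

    def proliferation(series: List[DailyEngagement]) -> int:
        """Crude proliferation measure: total engagement over the observation window."""
        return sum(day.total() for day in series)

    def persistence(series: List[DailyEngagement], floor: int = 100) -> int:
        """Crude persistence measure: days the message stayed above an engagement floor."""
        return sum(1 for day in series if day.total() >= floor)

    if __name__ == "__main__":
        series = [
            DailyEngagement(1, 400, 250, 900),
            DailyEngagement(2, 120, 80, 300),
            DailyEngagement(3, 20, 10, 40),
        ]
        print("Proliferation:", proliferation(series))
        print("Persistence (days):", persistence(series))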

Combining cutting-edge communications with psychosocial science to employ psychological capitulation strategies has changed the character of modern war. Adversaries combine half-truths with psychodynamic behavioral constructs to compete for strategic cognitive terrain. The U.S. military currently lacks the authorizations and capabilities required to protect societies against gray propaganda. Peter Singer and Emerson Brooking quoted an unnamed U.S. Army officer as saying, “Today we go in with the assumption that we’ll lose the battle of the narrative.”49 The United States can no longer accept loss in the information fight. As Dennis Murphy and James White cautioned, “Failure to . . . react to propaganda cedes the international information environment to the enemy”50 and allows adversaries to continuously outflank us on the cognitive battlefield. JFQ

Notes

1 Valery Gerasimov, “The Value of Science Is in the Foresight: New Challenges Require Rethinking the Forms and Methods of Carrying Out Combat Operations,” trans. Robert Coalson, Military Review, January–February 2016 (originally published in Military-Industrial Kurier, February 27, 2013), available at <https://www.armyupress.army.mil/Portals/7/military-review/Archives/English/MilitaryReview_20160228_art008.pdf>.

2 Hannes Heine, “Fighting ‘Fake News’ Online: How Soldiers in Latvia Got Fooled by Bots,” Der Tagesspiegel, October 2, 2019, available at <https://www.euractiv.com/section/eastern-europe/news/fighting-fake-news-online-how-soldiers-in-latvia-got-fooled-by-bots/>.

3 Dmitry Adamsky, “From Moscow with Coercion: Russian Deterrence Theory and Strategic Culture,” Journal of Strategic Studies 41, no. 1–2 (2018), 45.

4 Carl von Clausewitz, On War, ed. and trans. by Michael Howard and Peter Paret (New York: Everyman’s Library, 1993), 91.

5 Dennis Murphy and Daniel Kuehl, “The Case for a National Information Strategy,” Military Review, September–October 2015, available at <https://www.armyupress.army.mil/Portals/7/military-review/Archives/English/MilitaryReview_20151031_art013.pdf>; Donald Bishop, “Elements of U.S. Informational Power,” lecture to Joint Advanced Warfighting class, Joint Forces Staff College, Norfolk, Virginia, October 11, 2019.

6 Joseph S. Nye, Jr., “What China and Russia Don’t Get About Soft Power,” Foreign Policy, April 29, 2013, available at <http://foreignpolicy.com/2013/04/29/what-china-and-russia-dont-get-about-soft-power/>.

7 Dennis M. Murphy and James F. White, “Propaganda: Can a Word Decide a War?” Parameters 37, no. 3 (Autumn 2007), 15, available at <https://press.armywarcollege.edu/parameters/vol37/iss3/23/>.

8 Bishop, “Elements of U.S. Informational Power.”

9 Truda Gray and Brian Martin, “Backfires: White, Black, and Grey,” Journal of Information Warfare 6, no. 1 (2007), 7–16, available at <https://www.bmartin.cc/pubs/07jiw.html>.

10 Bishop, “Elements of U.S. Informational Power.”

11 Robert Cialdini, Influence: The Psychology of Persuasion (New York: Harper Business, 2007), 11.

12 Adamsky, “From Moscow with Coercion”; Bishop, “Elements of U.S. Informational Power”; Murphy and Kuehl, “The Case for a National Information Strategy”; Christopher Paul and Miriam Matthews, The Russian “Firehose of Falsehood” Propaganda Model: Why It Might Work and Options to Counter It (Santa Monica, CA: RAND, 2016), available at <https://www.rand.org/pubs/perspectives/PE198.html>; Peter Singer and Emerson Brooking, LikeWar: The Weaponization of Social Media (New York: Houghton Mifflin Harcourt, 2018).

13 Murphy and Kuehl, “The Case for a National Information Strategy,” 73.

14 Jess Block Nerren, “Civic Engagement, Fake News and the Path Forward,” Journalism and Mass Communication 8, no. 2 (February 2018), 51, available at <https://www.researchgate.net/publication/327793603_Civic_Engagement_Fake_News_and_the_Path_Forward_Peer_Reviewed_Journal_Article_in_Journalism_and_Mass_Communication/link/5efe737ca6fdcc4ca4474d67/download>.

15 Leon Festinger, A Theory of Cognitive Dissonance (Stanford, CA: Stanford University Press, 1957), 9.

16 Ibid., 17.

17 Nicholas Levy, Cindy Harmon-Jones, and Eddie Harmon-Jones, “Dissonance and Discomfort: Does a Simple Cognitive Inconsistency Evoke a Negative Affective State?” Motivation Science 4, no. 2 (September 2017), 95–108.

18 Festinger, A Theory of Cognitive Dissonance, 18.

19 Sebastian Cancino-Montecinos, Fredrik Björklund, and Torun Lindholm, “Dissonance and Abstraction: Cognitive Conflict Leads to Higher Level of Construal,” European Journal of Social Psychology 48, no. 1 (May 2017), 100–107.

20 Sebastian Cancino-Montecinos, Fredrik Björklundt, and Torun Lindholm, “Dissonance Reduction as Emotion Regulation: Attitude Change Is Related to Positive Emotions in the Induced Compliance Paradigm,” PLoS One 13, no. 12 (December 2018), 3, available at <https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6296533/>.

21 Jonah Berger and Katherine L. Milkman, “What Makes Online Content Viral?” Journal of Marketing Research 49, no. 2 (April 2012), 199, available at <https://jonahberger.com/wp-content/uploads/2013/02/ViralityB.pdf>.

22 Jason Stephens, “How to Cheat and Not Feel Guilty: Cognitive Dissonance and Its Amelioration in the Domain of Academic Dishonesty,” Theory Into Practice 56, no. 11 (March 2017), 1–10. Locus of control is the degree to which people believe that external forces have control over event outcomes. Attributions are assigned causes for behaviors.

23 Nye, “What China and Russia Don’t Get About Soft Power.”

24 Cialdini, Influence.

25 Singer and Brooking, LikeWar, 159–160.

26 Jahara W. Matisek, “Shades of Gray Deterrence: Issues of Fighting in the Gray Zone,” Journal of Strategic Security 10, no. 3 (2017), 13, available at <https://digitalcommons.usf.edu/cgi/viewcontent.cgi?article=1589&context=jss>.

27 Clausewitz, On War, 91. Insights into how people’s passion makes societies susceptible to dissonance reduction manipulation were developed with public affairs expert Colonel Elizabeth Mathias, Ph.D., USAF.

28 Andrew Chisholm, “Disrupt, Coerce, Legitimize, Attract: The Four Dimensions of Russian Smart Power” (thesis, Joint Advanced Warfighting School, June 27, 2018), 5.

29 Gerasimov, “The Value of Science Is in the Foresight,” 24.

30 Norbert Eitelhuber, “The Russian Bear: Russian Strategic Culture and What It Implies for the West,” Connections 9, no. 1 (Winter 2009), 5.

31 Ibid., 11.

32 Ibid., 13.

33 Gerasimov, “The Value of Science Is in the Foresight,” 27.

34 Mark Galeotti, “I’m Sorry for Creating the ‘Gerasimov Doctrine,’” Foreign Policy, March 5, 2018, available at <https://foreignpolicy.com/2018/03/05/im-sorry-for-creating-the-gerasimov-doctrine>.

35 Ray Finch, “How the Russian Media Portrays the U.S. Military,” Military Review 99, no. 4 (July–August 2019), 92, available at <https://www.armyupress.army.mil/Journals/Military-Review/English-Edition-Archives/July-August-2019/Finch-Russian-media/>.

36 Antoaneta Dimitrova et al., The Elements of Russia’s Soft Power: Channels, Tools, and Actors Promoting Russian Influence in the Eastern Partnership Countries, Working Paper Series No. 4 (Berlin: EU-STRAT, August 2017), 17, available at <https://ec.europa.eu/research/participants/documents/downloadPublic?documentIds=080166e5b53ce9da&appId=PPGMS>.

37 Singer and Brooking, LikeWar, 207.

38 Dimitrova et al., The Elements of Russia’s Soft Power, 15.

39 Assessing Russian Activities and Intentions in Recent U.S. Elections (Washington, DC: Office of the Director of National Intelligence, January 6, 2017), 1, available at <https://assets.documentcloud.org/documents/3719492/Read-the-declassified-report-on-Russian.pdf>.

40 Singer and Brooking, LikeWar, 112–113.

41 Amy Cheng and Humza Jilani, “Trump on Putin: The U.S. President’s Views, In His Own Words,” Foreign Policy, July 18, 2018, available at <https://foreignpolicy.com/2018/07/18/trump-on-putin-the-u-s-president-in-his-own-words/>.

42 Dante Chinni, “Foreign Leaders Top Trump on Trust, New Survey Finds,” NBC News, January 12, 2020, available at <https://www.nbcnews.com/politics/meet-the-press/putin-trudeau-merkel-top-trump-trust-internationally-new-survey-finds-n1114151>.

43 Todd Schmidt, “The Missing Domain of War: Achieving Cognitive Overmatch on Tomorrow’s Battlefield,” Modern War Institute, April 7, 2020, available at <https://mwi.usma.edu/missing-domain-war-achieving-cognitive-overmatch-tomorrows-battlefield/2020>; Nye, “What China and Russia Don’t Get About Soft Power”; Bishop, “Elements of U.S. Informational Power”; Singer and Brooking, LikeWar; Matisek, “Shades of Gray Deterrence”; Paul and Matthews, The Russian “Firehose of Falsehood” Propaganda Model.

44 Joint Publication 1, Doctrine for the Armed Forces of the United States (Washington, DC: The Joint Staff, March 25, 2013, Incorporating Change 1, July 12, 2017), I-19, available at <https://irp.fas.org/doddir/dod/jp1.pdf>.

45 Schmidt, “The Missing Domain of War.”

46 Murphy and Kuehl, “The Case for a National Information Strategy,” 72.

47 Svante E. Cornell and S. Frederick Starr, The Guns of August 2008: Russia’s War in Georgia (Oxford: Routledge, 2009), 195.

48 Cialdini, Influence, 12.

49 Singer and Brooking, LikeWar.

50 Murphy and White, “Propaganda,” 24.