Blind Sided: A New Playbook for Information Operations

BG Christopher M. Burns, USA (Ret.)
Lori Leffler
Varsha Koduvayur


Last summer, a coordinated campaign by users on Facebook and Twitter targeted the Australian company Lynas. In 2021, Lynas—the largest rare earths mining and processing company outside China—finalized a deal with the U.S. Department of Defense to build a processing facility for rare earth elements in Texas. A year later, numerous concerned Texas residents began to criticize the deal on social media, claiming that Lynas’s facility would create pollution, lead to toxic waste dumping, and harm the local population’s health. Their posts also denigrated Lynas’s environmental record and called for protests against the construction of this facility and a boycott of the company. 

Except the posts were not written by Texas residents—nor even by real people. The vast majority came from fake accounts that the People’s Republic of China (PRC) created and maintained as part of an influence operation. The PRC’s goal was to inflict reputational damage on one of the main threats to its dominance in the geopolitically important rare earths sector. Rare earth metals are critical to producing a range of technologies, including semiconductors, batteries, cell phones, electric vehicles, renewable energy systems, and missiles. China has a veritable chokehold on the global market for rare earth elements, controlling almost 90 percent of their production—an edge that Beijing is eager to maintain. More recently, PRC information operations (IOs) took aim at the U.S. midterm elections, with posts that disparaged certain senators and spread disinformation about politically motivated violence.

Sample of posts created by the PRC against Lynas. Source: Mandiant.

Of course, China is not the only foreign state targeting the American public through the information space. Other countries, such as Russia and Iran, are also known to carry out significant information operations. Nor are information operations a new phenomenon. During the Cold War, for example, the Soviet Union deployed what it dubbed “active measures” against the U.S. in an attempt to promote communism, discredit the Western liberal order, and deflect Western criticism of the Soviet Union. As Mark Galeotti notes, the main department responsible for active measures within the KGB was originally named “Service D,” a reference to disinformation (dezinformatsiya). Going back further in history, information operations are an ancient art. They are referenced in Sun Tzu’s The Art of War, and even the ancient Egyptians deployed them. Ramesses the Great used temple carvings to tell his population that he had crushed the Hittite forces at the Battle of Kadesh, subjecting the Levant to Egyptian rule. His claims were grossly exaggerated: the battle was in fact “inconclusive” at best, and both sides sustained heavy losses, a reality that Ramesses appears to have accepted in private.

Today’s threat environment gives renewed importance to information operations. Whether we call it gray zone competition, strategic competition, or irregular warfare, the contemporary threat environment can be generally characterized by state (or sub-state) conflict that falls below the threshold of open war. For U.S. adversaries, information operations have been essential in allowing them to compete with the U.S. in this liminal space without direct physical confrontation. 

To carry out information operations in the gray zone, actors rely on and manipulate psychological biases, social and political structures, and emerging technologies. And today’s media ecosystem—which combines traditional, digital, and social media—provides multiple vectors for our adversaries to deploy information operations.

A forthcoming report from the Irregular Warfare Center, titled Blind Sided: A Reconceptualization of the Role of Emerging Technologies in Shaping Information Operations in the Gray Zone, provides a new approach to assessing such information operations. The report advances the propagation-mobilization framework for analyzing gray zone IOs. This framework holds that two primary components determine the success of most contemporary information operations: 1) propagation of information, and 2) mobilization to action. An IO can have one of two effects on each of these lines of effort: it can either amplify or suppress the propagation of information, and it can either amplify or suppress mobilization to action—sometimes both simultaneously.

Amplifying propagation rapidly pushes and spreads information through the media ecosystem, as when something goes “viral.” Suppressive propagation takes information out of circulation, whether through content and account removals on social media platforms, censorship and self-censorship, or online harassment that silences information, to name a few examples.

Amplification of mobilization is one of the classic goals of information operations. Mobilization is amplified when a target population goes out and commits an action, often influenced by the propagation of information, be it amplifying or suppressive. Such actions include, for example, protesting and marching in response to social justice issues, sewing face masks for medical care providers during COVID-19, or storming the U.S. Capitol. On the flip side, suppressive mobilization seeks to prevent a population from doing something. Examples include convincing people to stay home instead of going to vote, to refrain from taking the COVID-19 vaccine, or to decline to join a gang.

The modern media ecosystem has drastically shortened the time it takes for propagation to impact mobilization—a sea change compared to the pace of information operations in the past. The reasons for this shrinking gap are mostly technological, but also cultural, with mobilization, action, and activism coloring our zeitgeist. It is much easier today for a wide range of actors, benign and malign alike, to mobilize audiences at a distance and at scale. The national security consequences of this shortened timespan should be evident: Americans may mobilize with actions and demands well before the facts on the ground are ascertained, posing challenges for public leaders and law enforcement personnel coordinating a response.

The propagation-mobilization framework may seem simple, but it has revolutionary implications: it provides an enduring schema that can be applied to any actor’s information operations. The framework thus de-emphasizes the centrality of specific actors to a particular IO. The dominant discourse on information operations is frequently actor-centric, assessing information operations through the lens of what individual actors carry out (e.g., PRC influence operations or Russia’s information operations during the 2016 U.S. election). But such a lens is, ultimately, reactive: the world waits for news about a new information operation to surface, then pounces upon that IO to dissect who is behind it and what the actor’s tactics were. The propagation-mobilization framework, in contrast, offers a unifying and overarching blueprint that can help practitioners anticipate future IOs before they occur.

Moreover, the framework allows for a conceptualization of information operations that is not limited to mis/disinformation. Certainly, mis/disinformation is an element of information operations, but IOs comprise more than mis/disinformation alone. It is one tool, among many, in an IO actor’s playbook.

Nor are information and mis/disinformation a binary. Indeed, for various reasons, there has been significant truth decay in the mainstream media environment, which historically has been seen as a purveyor of accurate information. Americans consistently report growing distrust of mainstream media outlets. The media environment has undergone significant changes within the last two decades, including increased sensationalism, increased partisanship, and a rush to publish underpinned by the 24/7 news cycle. As the mainstream information environment’s credibility wanes, its ability to counter mis/disinformation fades in tandem.

The report makes two other groundbreaking contributions. First, it adapts evolutionary theory to the study of information operations in the gray zone. Evolutionary theory has been applied in various corners of the social sciences, and it appears to hold unique value for information operations. Second, the report outlines a robust range of information operation elements and tools, both digital and offline, and in doing so offers a compendium, or playbook, of elements that IO actors may deploy.

The report uses IOs’ ends, ways, and means as a framework for understanding them in terms of the playbook it provides. By ends, we refer to the outcome that IO actors seek to achieve: why the actor is propagating information and/or mobilizing action in the first place. Ultimately, all information operations have a desired end, be it political, economic, diplomatic, or social. By ways, we indicate how IO actors seek to achieve their preferred end through amplifying or suppressing propagation and/or mobilization. And by means, we refer to the specific information operation that the IO actor employs—in other words, the specific play selected from the IO playbook that the Blind Sided report presents. IO actors have a number of tools—which we also refer to as elements, aspects, or components of their information operations—at their disposal to conduct an IO. These tools provide the methods, capabilities, and resources for conducting information operations.

The elements and tools of propagating information and mobilizing action are shared and utilized by a wide range of actors. It is not simply that our adversaries learn from their own mistakes and from the mistakes of other adversaries—they are likely surveying the whole spectrum of information operations tools available, and new tools or IOs may evolve from older tools deployed in either beneficial or harmful contexts. Our report shows, crucially, how these aspects evolve—they are not static, unchanging tools that IO actors deploy. Different and seemingly disparate elements of IOs can interact, leading to the creation of new tools and elements.[i] Additionally, components of IOs deployed in one context can be used in the opposite context: tools used in a benign or even beneficial manner can be coopted by adversaries and deployed to have a malign or harmful impact, and vice versa.[ii] For national security practitioners, this nuance is key. It will not be enough to pay attention only to what the “bad guys” do.

In practice, how should this framework be applied when assessing information operations? Let us return to the case of Lynas. That information operation, which cybersecurity firm Mandiant identified and named DRAGONBRIDGE, was the means by which China sought to damage Lynas. The ways, or lines of effort, were both propagation of information—promoting “narratives in support of the political interests of” Beijing—and attempts at mobilizing the public in response. The PRC used a range of technological tools and methods to carry out its IO, including setting up fake social media accounts meant to appear real to users and spreading disinformation about the company. The end was to besmirch the reputations of, and ultimately thwart gains made by, China’s competitors in rare earths, a sector important to China’s geopolitical leverage and strategic positioning.

Interestingly, Lynas was not the only victim. The PRC’s information operation also targeted Appia Rare Earths & Uranium Corp, a Canadian rare earths mining firm, and American rare earths producer USA Rare Earth. All the inorganic social media posts, and the fake accounts that created and spread them, were set up after each company announced new deals or discoveries: Lynas signed its deal with the Department of Defense, Appia announced that it had discovered a new rare earths area in Canada, and USA Rare Earth announced that it would build a rare earths processing plant in Oklahoma.

Ultimately, the Blind Sided report aims to create a new playbook for understanding information operations in the gray zone, pushing past the actor-centric, reactive analyses that have largely colored our understanding of IOs to date, and thereby to enrich that understanding.

The policy implications are manifold. First, policymakers must ensure that the U.S. is effective in developing IO-informed policies. Our national security policies must adequately reflect the complexities and nuances of information operations. Developing a forward-leaning, whole-of-government response to our adversaries’ information operations is essential to fighting them more effectively at their own information game.

Second, policymakers should consider adopting a more offensive posture to counter IOs. There is no reason to limit the framework to a defensive application only. Certainly, Washington needs to understand IOs better to effectively counter our adversaries’ propagation and/or mobilization. But we can also use the propagation-mobilization framework for our own purposes and ends. By turning the playbook back on our adversaries, we can confront them with the same strategic dilemmas they foist on us.

Third, the U.S. needs policymakers who understand IOs well enough to discern future technological adaptations that could increase malign actors’ abilities to propagate information or mobilize action. No one can predict with certainty what emerging technologies or adaptations will come down the pike. But there is an urgent need for policymakers to be aware of, vocalize, and direct preparation for future IOs that will certainly capitalize on such technologies.

Fourth, policymakers must build internal resiliency against information operations. It is imperative to improve Americans’ digital and media literacy and critical thinking skills through targeted programs, an endeavor that would need to cut across multiple generations to be effective. Moreover, as technology improves, IOs will likely become harder to detect and disrupt. Policymakers must therefore seriously address the internal conditions that our adversaries’ IOs are likely to exploit. By inoculating the population with digital and media literacy awareness and critical thinking skills, we can build internal resilience and resistance against manipulation by adversaries.

In this era of irregular warfare, information operations are only likely to grow in frequency, scale, and intensity in the future. The U.S. must take bold action to mitigate the challenges and limit the national security risks that our adversaries’ information operations pose—or risk being blindsided time and again.

About the Authors:

Brig. Gen. Chris Burns, U.S. Army, Retired, serves as senior advisor at the Irregular Warfare Center.
Lori Leffler is the chief of staff at the Irregular Warfare Center.
Varsha Koduvayur is a senior analyst at the Irregular Warfare Center.


[i] See Gartenstein-Ross et al., Blind Sided: Information Operations in the Gray Zone, p. 96, which notes: “Indeed, activist groups and social movement participants look to other movements’ actions when developing their own strategies. A recent example of this is the migration of utilizing umbrellas to ward off rubber bullets and smoke canisters, originally used in the Hong Kong pro-democracy protests in 2019, then used in the U.S. in ongoing protests in Portland, Oregon. In Portland, additional elements were developed by activists using everyday items such as garbage cans to make shielding for protection against anti-crowd and dispersal technologies used by the police.”

[ii] Hashtag hijacking is an example, where users on a social media platform coopt or “hijack” an existing popular hashtag to share unrelated content, so as to get that piece of content more viewership. See Gartenstein-Ross et al., Blind Sided: Information Operations in the Gray Zone, pp. 78-79, which notes: “Hashtag hijacking campaigns have been used by activists, advertisers, trolls, and violent non-state actors. Daesh [the Islamic State] performed several high-profile hashtag hijacks in 2014-15, including taking over hashtags associated with the World Cup to circulate propaganda and violent content to a massive worldwide audience. A more recent hashtag hijacking is QAnon’s takeover of the SavetheChildren hashtag. In the summer of 2020, QAnon followers determined that Wayfair’s prices for cabinets and other items were exorbitantly high, and based on an esoteric reading of Wayfair’s website, they circulated theories that Wayfair was in fact trafficking in children.… Over the course of days and weeks, #WayfairGate morphed into a hijack of the SavetheChildren hashtag, which was used regularly by international NGO Save the Children to promote its work and fundraise. Posts flooded social media, spreading rapidly through mothers’ groups. A series of SavetheChildren protest marches were organized across the United States for July and August, from big demonstrations in large cities to smaller protests in rural towns.”