The New Age of Propaganda: Understanding Influence Operations in the Digital Age
Editor’s Note: A version of this article was originally published by The Interpreter, which is published by the Lowy Institute, an independent, nonpartisan think tank based in Sydney. War on the Rocks is proud to be publishing select articles from The Interpreter.
Influence operations in the digital age are not merely propaganda with new tools. They represent an evolved form of manipulation that presents actors with endless possibilities — both benign and malignant. While the origins of this new form are semi-accidental, it has nonetheless opened up opportunities for the manipulation and exploitation of human beings that were previously inaccessible. Conducted now across the whole of society, these operations mark only the beginning of a new era of population-centric competition.
With regard to propaganda, the fundamental distinction between the old and the new lies in the difference between participatory and passive forms of information consumption.
Participatory Versus Passive
In the digital age, when we post, comment, like, share, and search, we are participating in information processing and knowledge formation in ways we did not before. We are actors in our own information consumption, and this represents a subtle but important shift.
In contrast, when reading, watching, or listening to traditional forms of media (print, television, radio), we are largely passive observers. When we make choices about what to believe and what to dismiss, we do so at arm’s length from the subject matter. Traditional propaganda always faced this obstacle — the fact that those subject to it retained enough distance for the possibility of doubt.
In the same way that students learn more effectively by doing, it is the doing part of contemporary digital media that is distinct. Participation is a type of cognitive investment. People engage differently when they are themselves participants in the narrative. They experience the narrative as it is developing; it becomes part of their lived experience. Posting, commenting, tagging, and sharing — they are no longer at arm’s length from the subject matter.
The power of the co-created narrative can also be seen in the rise of Internet gaming. China has formed partnerships with online gaming companies to proliferate its ideology, which speaks to gaming’s persuasive power and strategic utility. Equally, in a depersonalized gaming environment, the human consequences are less evident. In a game-like environment, does it matter if you’re a bit more extreme? In this new digitalized and monetized influence environment, people have not only become the subject matter; they have also become the gamed.
Research into digital-age influence operations is revealing that these operations are rarely about changing what people believe. They are instead about confirming what people already believe — what Alicia Wanless and Michael Berk describe as a “participatory propaganda” model. This requires little of the sophistication of analogue PsyOps. Instead, it involves flooding people with confirmation of a given belief while starving them of opportunities to question and doubt others. It is easy to see how toxic binaries form organically in communities, which can then be exploited through network routing. The echo chamber effect was a key component of Russian interference in the 2016 U.S. election.
Blurring the civilian and military boundary, the exploitation of these and other quirks of human cognition has become a massive industry. In the civilian domain, “captology” — the use of “computers as persuasive technologies” — was a term coined in 1996 by Stanford University professor B.J. Fogg. Research into the cognitive processes occurring in human-computer interaction has its own acronym as a field of study: HCI. Troves of literature on these topics now exist; governments around the world, including Australia’s, are standing up “nudge units” that deploy these tools and methods; and the World Bank is a leading advocate.
As with all technologies, some people will seek to use them to better the lives of other human beings. Others will use them to exploit and manipulate. Nothing defines technology more than this intractable fact. We have opened a Pandora’s Box of methods and means which, driven hard by both commercial and military interests, have made their way into society long before we have anything like a grasp of what that might mean.
The national security, intelligence, and defense communities are adapting these concepts. The Australian Army’s new futures statement, Accelerated Warfare, captures the challenges of constant competition and the strategic threat to the socio-political system in the digital age. The army’s unique value in this changing landscape requires it to augment its traditional kinetic combat strengths with persistent engagement. The old binaries of wartime and peacetime, and of kinetic and non-kinetic force employment, confine engagement to episodic and short-term operational activities. These binaries are fully challenged by this new environment.
Accidents and Side Effects
The central myth of Silicon Valley ideology was captured in Stewart Brand’s maxim that “information wants to be free.” This must by now be understood for the falsehood that it is. If information wants anything, it is to be controlled. As Scott Malcomson has shown, states are now competing for the political geography of cyberspace in a way that has fragmented a short-lived global Internet. Influence in these networks is spread across the digital stack, from the submarine cable to the human-computer interface, but the primary hub of power is at the level of network routing. Controlling the flow of information at this level is the state’s business again, and it will likely kill off all remaining hubris about a global online community, not to mention the absurd notion that greater connectivity is an unalloyed good.
While states wrest back control of network flows, however, a fundamental shift has taken place in the way human beings interact with information. People in the digital age are participants in a new and mostly accidental transformation of how they relate to knowledge, power, and authority, with implications for everything they do. It is hard to say whether a new status quo will embed and stabilize — giving social and political institutions the chance to catch up, deliberate, and adapt — or whether the current interregnum is a more permanent condition.
The captologists are an unrelentingly optimistic bunch. Unfortunately, the belief that these practices will likely yield a balance of benefit for humankind is not based on science, but on ideology.
Dr. Zac Rogers is a research lead at the Jeff Bleich Centre for the U.S. Alliance in Digital Technology, Security, and Governance at Flinders University of South Australia. He is currently lead researcher in a three-year defense/academic collaborative project exploring the impact of digital transformation from infrastructure to the human/computer interface on Australia’s internal and external security, national interests, defense planning, and strategy.
Dr. Emily Bienvenue is a senior analyst in the Joint Operations and Analysis Division of the Defence Science and Technology Group, where she provides support to strategic policy and operational planning. Emily is adjunct and research lead at the Jeff Bleich Centre.
Dr. Maryanne Kelton is deputy director at the Jeff Bleich Centre. Currently, Kelton is working on a strategic multilayered research project on socio-cognitive security, uncertainty, and trust in the digital age.
The views expressed here are the authors’ own and do not represent the official view of the Australian Defence Department.