The Great Online Convergence: Digital Authoritarianism Comes to Democracies

Editor’s Note: This is the final article in a series on digital authoritarianism. The introductory essay can be found here. Steven Feldstein’s essay, “When it Comes to Digital Authoritarianism, China is a Challenge — But Not the Only Challenge,” can be found here. Jessica Chen Weiss’ essay, “Understanding and Rolling Back Digital Authoritarianism,” can be found here. The concept for the series emerged from a policy workshop hosted by Bridging the Gap and the Center for a New American Security.

Digital autocracy is on the march. Troll armies, foreign disinformation campaigns, and Russian-exported conspiracy theories are all working to undermine American democracy. So we are told, with increasing frequency, by think tanks and politicians alike.

But this image of a battle between virtuous democracies and malicious autocracies obscures a growing trend: a convergence in how democratic and autocratic governments are using surveillance and disinformation to shape political life. The newest tactics of digital disinformation — trolling, strategic distraction, and conspiracy theories — are not only commonly deployed inside democracies, but thrive under democratic norms of free information flows. A recent report, for example, found that politicians inside 45 democracies have used social media for “amassing fake followers or spreading manipulated media to garner voter support.”

Displacing the blame onto foreign autocrats, while politically palatable in Washington, creates a false distinction in which an innocent democracy is being subverted by malevolent outside forces. Doing so distracts from the core problem — the immense incentives for disinformation built into democratic institutions.

That doesn’t mean democracies and autocracies are becoming indistinguishable. The extent of censorship and the consequences of protest are often worse in nondemocratic states. But in some key respects, the political uses of the internet in autocracies and democracies are becoming harder to distinguish.

A Brief Evolution of Internet Politics

The internet was initially assumed to be the dictator’s natural enemy, a “liberation technology” that would benefit open regimes and destroy closed ones. In some ways, just the opposite has happened. For autocracies, the internet is no longer just a threat but a powerful tool for strengthening and legitimizing the regime. For democracies, meanwhile, the internet has sparked an “epistemic crisis” in which the truth simply ceases to matter. The result is not a widening gap between open and closed regimes but an increasing blurring of the two.

How did this convergence come about? The blurring of regime tactics stems from the evolving uses of internet technology by political actors. Roughly speaking, since the turn of the century this evolution can be divided into four stages: blocking, filtering, co-opting, and flooding.

Blocking

This familiar strategy involves a brute-force denial of internet access, either entirely or selectively, to protesters or other perceived threats to the regime. It remains the go-to tool for autocrats confronting dissent. Last year, for example, saw 196 internet shutdowns, compared to only 75 in 2016. Continuing the upward trend, the first half of this year alone has seen 114 shutdowns in 23 countries. One common use involves "just-in-time" blocking, in which internet access is temporarily disabled around particular events, such as a controversial election or a sensitive anniversary.
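
The scheduling logic behind "just-in-time" blocking is simple enough to sketch. The snippet below is a purely hypothetical illustration: the dates, names, and single on/off switch are invented, and real shutdowns are executed through orders to carriers, firewall rules, or withdrawn routing announcements rather than a Python function.

```python
# Hypothetical sketch of "just-in-time" blocking: access is cut only during
# a politically sensitive window, then quietly restored. The dates and the
# single on/off switch are invented stand-ins for real network-level controls.
from datetime import date

BLACKOUT_WINDOWS = [
    (date(2019, 6, 3), date(2019, 6, 5)),  # e.g., around a sensitive anniversary
]

def access_allowed(today: date) -> bool:
    """Deny connectivity only inside a blackout window."""
    return not any(start <= today <= end for start, end in BLACKOUT_WINDOWS)

print(access_allowed(date(2019, 6, 4)))  # False: shutdown in effect
print(access_allowed(date(2019, 7, 1)))  # True: service restored
```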

Filtering

Filtering is more sophisticated than blocking. It involves autocratic regimes deciding which content is censored and which content is allowed to remain online despite its apparently controversial nature.

For example, some experts argue that China does not censor all "negative, even vitriolic" criticism of the government and its leaders. Instead, the Chinese government censors content that threatens to spur collective action via organized protest. The ultimate goal is not to silence all criticism but to "reduce the probability of collective action by clipping social ties" wherever they appear.

Critics of this research note that filtering is not a conscious top-down strategy, since Chinese censorship is not “a uniformly enforced, top-down system.” Instead, content filtering occurs at the local level, leading to varied implementations. Chinese internet control, in other words, is too “decentralized and fractured” to speak of any coherent national strategy. The end result, however, is the same. Certain information gets through while other information is filtered.
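
Whether directed from the top or improvised locally, the underlying mechanism can be sketched in a few lines. The example below is purely illustrative, with hypothetical term lists drawn from no actual censorship system: harsh criticism passes through untouched, while posts containing mobilization language are flagged for removal.

```python
# Purely illustrative sketch of selective filtering: vitriolic criticism is
# allowed to stand, while language that could coordinate collective action
# is flagged for removal. All term lists here are hypothetical.

CRITICISM_TERMS = {"corrupt", "incompetent", "liar"}           # tolerated
MOBILIZATION_TERMS = {"protest", "march", "gather", "strike"}  # censored

def should_censor(post: str) -> bool:
    """Censor only posts that signal collective action, no matter how
    harsh their criticism of officials is."""
    words = set(post.lower().split())
    return bool(words & MOBILIZATION_TERMS)

posts = [
    "the mayor is corrupt and incompetent",      # vitriolic, but stays up
    "gather at the square tomorrow to protest",  # organizing, taken down
]
for post in posts:
    print("CENSOR" if should_censor(post) else "ALLOW", "->", post)
```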

Co-opting

In this strategy, governments move beyond merely suppressing online discourse and proactively subvert and co-opt social media for their own purposes. Social media, for example, has increasingly enabled nondemocratic regimes to fulfill key regime-enhancing functions. They can gather previously hidden or falsified information about public grievances. They can increase the transparency of local officials’ performance, bolster regime legitimacy by shaping public discourse, and mobilize their own support base against protest movements. Increasingly sophisticated online tactics have enabled dictators to overcome barriers historically associated with autocratic rule, such as informational scarcity.

For example, China’s former president Hu Jintao told The People’s Daily that the internet “is an important channel for us to understand the concerns of the public and assemble the wisdom of the public.” And Russian opposition leader Alexei Navalny observed that the Putin regime uses the internet as a “focus group” to find out the concerns and desires of ordinary Russians. In this way, social media allows autocrats to get a much clearer view of people’s real opinions — and therefore anticipate potential unrest — without prying open the larger marketplace of ideas. Similarly, Xiao Qiang, editor of the blog China Digital Times, has argued that outrage on social media is sometimes the only channel for party officials to get honest feedback about their local apparatchiks.

In sum, autocrats have moved beyond strategies of “negative control” of the internet, in which regimes attempt to block, censor, and filter the flow of communication, and toward strategies of proactive co-option in which social media helps regimes survive.

What are the implications of autocratic co-option of internet technology?

First, citizen participation in social media may not signal regime weakness, but may in fact enhance regime strength and adaptability. Even the harshest dictators have an incentive to allow some degree of social media freedom — enough to gauge public opinion but not so much that discussion spills over into protest. Consequently, evidence of online debate is not necessarily a sign of regime weakness but a harbinger of its durability.

Second, democratization in mixed and autocratic regimes may become stuck in a low-level equilibrium trap, as these regimes become responsive enough to subvert or preempt protests without having to undertake fundamental liberalizing reforms or loosen their monopoly over political control. While social media may make regimes more responsive at the local level, it also produces a shallow sort of democracy, in which populist causes like municipal corruption are taken up by the central government, sometimes with great fanfare, even as the chances of fundamental reforms like the introduction of multi-party competition become more remote.

Third, co-opting social media may help regimes inoculate themselves against the reach of transnational social movements as well as domestic reforms, portending greater obstacles to the diffusion of successful protest across borders.

Fourth, more speculatively, hybrid regimes may become increasingly less likely to use elections as a way of gathering information, revealing falsified preferences, coordinating elites, channeling grievances, and bolstering regime legitimacy. Why risk losing even a rigged election if less risky but similarly effective alternatives are at hand? Closed autocracies, which have traditionally existed in a particularly information-scarce environment, may have fewer incentives to introduce elections in the future.

Flooding

Traditional methods of digital autocracy — blocking and censoring — rely on suppressing free flows of information, and therefore don’t live easily inside democracies. But such tactics of control are now only one part of the picture. Instead, what we’ve seen more recently is the ascendance of a more subtle method — what people who study the phenomenon have come to call informational flooding.

Flooding relies not on controlling information flows but on facilitating them, even when the information itself is false, distracting, or otherwise worthless. (This is why Russian observers, who have become deeply familiar with the method, call it "info-noise.") Henry Farrell and Bruce Schneier, two experts on the politics of the internet, write that instead of relying only on the censorship tactics of blocking and filtering, regimes that flood seed "public debate with nonsense, disinformation, distractions, vexatious opinions and counter-arguments, making it harder for their opponents to mobilize against them."

For example, when faced with a damaging event like the Skripal poisoning, the Russian government’s response (operating in part through state-controlled media) was to flood the informational space with potential explanations, however implausible. The journalist Alistair Bunkall enumerated 19 theories put forward, with possible culprits ranging from suicide to Slovaks to Yulia Skripal’s future mother-in-law.

Flooding assumes that censorship will not always succeed, and in fact might be counterproductive. It therefore does not bear the clear stamp of autocracy like traditional methods of control. Instead, the focus is on deluging people with a variety of provocative, distracting, even blatantly false information. As a result, it lives quite comfortably inside democracies, is perfectly compatible with democratic norms, and is easily generated by democratic actors.

The goal of flooding is not to dominate the informational space but to dilute it. Eschewing the heavy hand of censorship, it instead produces a world in which, as Peter Pomerantsev’s 2015 book put it, “nothing is true and everything is possible.” The ultimate goal is cynicism, overwhelming distrust of all news sources, and the fragmentation of a shared social reality. The same free flows of information that make disrupting democracy by censorship harder also make the disruption of democracy by flooding easier.

Put another way, we don’t see much overt blocking inside democratic states. We do see, however, a lot of flooding, amplified by the very things that make democracy viable — the unfettered flow of information, and the presence of multiple political and media actors, each with their own competing strategies of persuasion.

Flooding and the Future of Democracy

The opposite of internet freedom, therefore, is not censorship but a mix of control, co-option, and strategic distraction. “Flood the zone with shit,” declared Steve Bannon in 2018 as his strategy for dealing with the media. The result, a fog of half-truths, leaves potential voters “numb and disoriented, struggling to discern what is real in a sea of slant, fake, and fact.” As in Pomerantsev’s Russia, nothing is true and everything is possible.

The growing pervasiveness of info-flooding has three implications for the evolution of democracy.

First, flooding has very different effects in autocracies and democracies. Democracies require a degree of social consensus to function, one created by a shared belief in a common reality. This consensus does not expect people to agree, but it does expect shared knowledge of what they disagree about. (To borrow from Daniel Patrick Moynihan, “Everyone is entitled to his own opinion, but not his own facts.”)

Autocracies, on the other hand, benefit from the distrust, cynicism, and social atomization produced by flooding, precisely because it inhibits political engagement and paralyzes organized social movements. Disinformation, social media echo chambers, “deepfakes,” and radicalization algorithms all serve to lower the costs of autocracy while undermining the basic prerequisites of democratic deliberation.

The rise of artificial intelligence will only exacerbate the problem. Programs like GPT-2, for instance, which can generate automated text indistinguishable from human-produced writing, will enable even greater flooding of the informational space without the involvement of human troll armies or obvious bots. A recent New Yorker article highlighted the danger, though again framing the threat as primarily external: “Russian troll farms could use an automated writer like GPT-2 to post, for example, divisive disinformation about Brexit, on an industrial scale, rather than relying on college students in a St. Petersburg office block who can’t write English nearly as well as the machine.” There’s no reason, however, to think that such abuse would be limited to foreign troll armies.
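
To illustrate how low the barrier has become, here is a minimal sketch of automated text generation with GPT-2. It uses the open-source Hugging Face transformers library, a common way to run the model, though the article names only GPT-2 itself; the prompt is invented for the example.

```python
# Minimal sketch: generating text with GPT-2 via the open-source
# Hugging Face `transformers` library (pip install transformers torch).
# The prompt below is invented for illustration.
from transformers import pipeline, set_seed

generator = pipeline("text-generation", model="gpt2")
set_seed(42)  # make the demo reproducible

# One prompt yields several distinct continuations; looping over prompts
# is all it would take to produce such text at scale.
outputs = generator(
    "The real story behind the scandal is",
    max_new_tokens=50,
    do_sample=True,
    num_return_sequences=3,
)
for out in outputs:
    print(out["generated_text"], "\n---")
```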

Second, flooding offers autocracies a way out of the so-called “dictator’s dilemma.” Allowing the increased availability of information is crucial for regimes desiring economic development, the argument goes, but also threatens their existence. Effective use of flooding, however, allows governments to reap the benefits of online technology without suffering its destabilizing costs. In the long run, the assumption that increased access to information will overturn autocratic regimes is probably unjustified.

Third, the problem of flooding is not a malevolent force that exists outside democracies. Russia will undoubtedly continue its attempts to influence U.S. politics using these methods (which is why it is better to see Russian interference as a process, not an event). But trolling and disinformation are not primarily things done to the United States by external actors like Russia. They also stem from domestic actors inside the country, and they benefit from precisely the same free discourse that is crucial to a functioning democracy.

President Donald Trump himself, for example, has been a key source of disinformation since taking office, on issues ranging from family border separations to Sharpie-altered weather maps. Shortly after his 2020 State of the Union address, Trump posted a video of House Speaker Nancy Pelosi, edited to make it look as if she were ripping up his speech while he thanked soldiers for their service. (Facebook has refused to take down the video.)

The president’s proxies and advisers, meanwhile, have likewise added to a steady hum of distraction and misinformation. An internal report by Fox News, for example, cautioned that several of its on-air regulars have spread disinformation about political events.

The Call Is Coming from Inside the House

Beyond the spread of disinformation, techniques of mass surveillance are also being adopted by governments across all regime types. With the collection and sharing of customer data as the driver of the digital economy, private monitoring (sometimes in partnership with law enforcement) is becoming a ubiquitous and socially accepted feature of life in democratic states. The U.S. government, meanwhile, has undertaken massive surveillance programs aimed at its own citizens, as revealed by Edward Snowden’s exposure of the XKeyscore program. When it comes to surveillance, a clear line between autocratic and democratic states is becoming harder and harder to draw.

Even the traditional tools of digital autocracy, like blocking software, rely on products created and sold by Western democracies. A recent report on disinformation found that Facebook “remains the platform of choice for social media manipulation” in 56 countries. At the same time, autocratic states like Syria use technology produced by Western companies to suppress online dissent and block internet access.

As a result, the latest techniques of digital authoritarianism are increasingly indistinguishable from techniques employed inside democracies themselves. The same things that allow democracy to function — free flows of information and multiple competing actors — also make flooding pervasive and inevitable. An emphasis on foreign trolls, while politically convenient, creates an unhelpful distinction that obscures the root of the problem. As the horror movie has it, the call is coming from inside the house.

Seva Gunitsky is an associate professor of political science at the University of Toronto. He is the author of Aftershocks: Great Powers and Domestic Reforms in the Twentieth Century (Princeton University Press). Some of his work has appeared in International Organization, International Studies Quarterly, International Theory, and Perspectives on Politics, as well as in non-academic outlets like the Washington Post, The New Republic, and The American Interest. Twitter: @SevaUT.

Image: Flickr (Photo by Hollywata)