Defending the Indefensible: A New Strategy for Stopping Information Operations


In the book Snow Crash by Neal Stephenson, malicious viral information is deliberately spread by a nefarious actor to infect both computers and people’s brains. The virus is transmitted in a variety of ways: through the exchange of bodily fluids, by viewing code with the eyes, as an injected drug, and over computer networks. Regardless of the method, the key idea is that malicious information can infect software on computers as well as the software running in your brain.

As a cyber-intelligence analyst with a degree in modeling & simulation, I’ve spent a lot of time trying to fit information operations into a cyber-attack model. Information operations, sometimes referred to as information warfare or political warfare, have been used for centuries by many different entities but have recently regained prominence. According to the RAND Corporation, “Information operations and warfare, also known as influence operations, includes […] the dissemination of propaganda in pursuit of a competitive advantage over an opponent.” Historical information operations include Soviet propaganda blaming the United States for creating AIDS as a biological weapon and ISIL falsely claiming responsibility for attacks such as the 2017 massacre in Las Vegas. And of course, contemporary observers are familiar with Russia’s use of information operations in an effort to influence democratic processes around the world.

I had the epiphany that information operations don’t fit normal cyber-attack models such as the Lockheed Martin Cyber Kill Chain® because they’re not “conventional” state-sponsored cyber-attacks like the OPM breach or the Sony hack. Those attacks often take advantage of a software exploit, which is used to execute code on the victim’s device. In the last few years, instead of exploiting software, cyber-attacks have increasingly exploited people. This typically takes the form of social engineering a victim into enabling macros or trusting content that they should not. Information operations, though they pursue different, more politically motivated goals, also seek to exploit people rather than software.

In this way, information operations more closely resemble the introduction and spread of a computer worm such as NotPetya, which was deliberately introduced via a server but also propagated via infected computers. In cybersecurity, a computer worm is understood primarily in terms of how it spreads from host to host, how it can be stopped, and what impact it has. That host-to-host transmission process resembles the spread of biological viruses, which offer a more fruitful analogy for information operations.

Both computer worms and biological viruses use a host’s resources to replicate and would be helpless without this host. Information operations are the deliberate introduction of viral information, which then uses brain resources to replicate and “infect” new hosts. Computer technology can facilitate the propagation, but unless people are convinced (infected), the information will not continue to propagate and the operation will be ineffective.
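To make the epidemiological framing concrete, here is a minimal sketch of the classic SIR (susceptible-infected-recovered) epidemic model recast for a viral narrative, where “infection” means being convinced and “recovery” means ceasing to believe or share. The population size, transmission rate, and recovery rate below are illustrative assumptions, not measurements of any real campaign.

```python
# Minimal SIR-style sketch of viral-information spread. The parameters
# (beta = transmission rate, gamma = "debunk"/recovery rate) are
# illustrative assumptions, not measurements of any real campaign.

def simulate_sir(population=1_000_000, infected=100, beta=0.4, gamma=0.1, days=60):
    susceptible, recovered = population - infected, 0
    history = []
    for day in range(days):
        new_infections = beta * susceptible * infected / population  # convinced by exposure
        new_recoveries = gamma * infected                            # stop believing or sharing
        susceptible -= new_infections
        infected += new_infections - new_recoveries
        recovered += new_recoveries
        history.append((day, round(susceptible), round(infected), round(recovered)))
    return history

if __name__ == "__main__":
    for day, s, i, r in simulate_sir()[::10]:
        print(f"day {day:2d}: susceptible={s:>7} believing={i:>7} recovered={r:>7}")
```

Even in this toy model, the number of people ever “infected” is driven largely by the transmission rate relative to the recovery rate, which is one reason the preventive measures discussed below target transmission rather than individual cases.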

Here is the idea I’m trying to infect you with: We can apply the strategies the Centers for Disease Control and Prevention (CDC) uses to combat emerging epidemics to contain the spread of viral information and reduce the effectiveness of information operations. The CDC’s strategies include educating the public, training responding personnel, using science and technology to understand transmission and treatment, and identifying specific areas in need of additional resources. That’s right, the CDC isn’t just for stopping zombie outbreaks. Its recommendations for containing emerging epidemics can help fill the gaps in, and improve the effectiveness of, the response to information operations.

What’s Changed?

Russia has been conducting information operations to influence public opinion and political dialogue in the United States and other nations, using conventional media, bots on social media platforms, advertising campaigns, and proxies and personas, sometimes in mixed-mode campaigns that also utilize computer network exploitation. These two operational modes, influence activity and network exploitation, can be combined for greater effectiveness, for example by using the Guccifer 2.0 persona to release hacked emails while also running troll armies and cooperating with platforms such as WikiLeaks to spread false information about those emails. Some stories spread organically from one medium to another, as websites and journalists repeated stories introduced elsewhere.

The effectiveness of these operations is hard to measure and difficult to separate from the activity of domestic actors. Russian information operations are still targeting, and are likely to continue to target, the United States, its allies, and its areas of interest. Emerging technologies are likely to significantly increase the impact of future information operations.

Why Is This So Hard to Fix?

One similarity between information operations and epidemics is that it’s unclear who is responsible for combating them. Just as the CDC counsels for epidemics, in the absence of herd immunity, addressing misinformation and information operations will require cooperation across government, the private sector, the media, and the general public.

Each of these entities, on its own, is ill-equipped to handle the problem of information operations. Governments are not suited to fact-checking false information: they may operate too slowly, and relevant data may be classified or difficult to release. Moreover, some may be skeptical of governments “policing” the spread of information, a legitimate credibility concern for both epidemics and information operations. Take, for instance, the rejection of legitimate polio vaccinations in Pakistan after it was revealed that the CIA had used a vaccination campaign to help find Osama bin Laden.

Some have assigned the responsibility for combating information operations to media outlets. In my private discussions, cyber and counterintelligence experts have suggested engaging neutral third parties such as Bellingcat to call out and fact-check disinformation. Unfortunately, these actors lack the resources and viewership numbers needed for widespread information dissemination. Moreover, in many cases outlets and platforms have direct financial motivation, in the form of advertising dollars and other revenue concerns, to facilitate information operations and not to fact check them. Finally, fact-checking alone may be too slow, can sometimes accidentally amplify misinformation, and may not sufficiently counter propaganda. As the saying goes, a lie can get halfway around the world in the time it takes the truth to put its pants on.

The U.S. State Department’s Global Engagement Center (GEC) is “charged with leading the U.S. government’s efforts to counter propaganda and disinformation from international terrorist organizations and foreign countries.” However, the State Department is focused on foreign policy, not exerting influence domestically. Moreover, the GEC has faced serious limitations since its creation in April 2016.

How Do We Stop the Infection?

Another issue that will be familiar to disease control experts is that of prevention versus treatment: Can we stop the infection from taking hold in the first place, or should we focus on addressing it after the fact? As far as is publicly known, the U.S. government has not conducted kinetic activity in retaliation for Russian information operations. It has taken a number of other punitive actions, including imposing sanctions on Russia and indicting employees of the Internet Research Agency.

Recommendations from the CDC, translated to information operations, suggest that the focus should be on prevention rather than retaliation, or, in national security parlance, on deterrence. Here are some proposed or enacted solutions borrowed from infectious disease control.

Training

The United States and other vulnerable countries need basic public education in “information hygiene,” analogous to the contraceptives, hygiene campaigns, and other preventive measures used to limit the spread of biological epidemics. The equivalent might be awareness campaigns such as Learn to Discern in Ukraine or crisis communication in Scandinavia, along with other approaches such as games, short videos, or even memes. Similarly, we need to train those in the media whose work directly affects the spread of an information operation (the equivalent of medical personnel). Resources such as First Draft News are a good start in this regard.

Identify Patterns in the Introduction and Transmission of Misinformation

Precursors of information operation activity can be identified and shared, just as with symptoms of disease. Experts may be able to use creative methods to degrade an information operation before it progresses, including inoculating those who might retransmit it. Russia often telegraphs the focus of an upcoming information operation by publishing related content in RT and Sputnik. This content can be used to identify the likely targets of information operations and act to protect them, for instance by notifying victim organizations and engaging fact-checkers.
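As a hypothetical sketch of what such early warning might look like, the snippet below flags terms that surge in a recent batch of state-media headlines relative to a baseline period. The headline collection, stopword list, and thresholds are all illustrative assumptions, not a description of any deployed system.

```python
# Hypothetical early-warning sketch: flag terms surging in state-media
# headlines relative to a baseline period. Collecting the headlines is
# assumed to happen elsewhere; inputs here are plain lists of strings.

from collections import Counter

STOPWORDS = {"the", "a", "an", "of", "in", "on", "to", "and", "for", "with"}

def term_counts(headlines):
    """Count non-stopword terms across a list of headlines."""
    counts = Counter()
    for headline in headlines:
        counts.update(w for w in headline.lower().split() if w not in STOPWORDS)
    return counts

def surging_terms(recent, baseline, min_count=5, ratio=3.0):
    """Return terms at least `ratio` times more frequent than in the baseline."""
    now, then = term_counts(recent), term_counts(baseline)
    surging = [
        (word, count)
        for word, count in now.items()
        if count >= min_count and count / (then.get(word, 0) + 1) >= ratio
    ]
    return sorted(surging, key=lambda pair: pair[1], reverse=True)
```

A surge in a term such as a victim organization’s name could then trigger the notifications and fact-checking engagement described above.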

I also suggest digital watermarking to interfere with forgeries, using file metadata for tracking purposes, and employing data science to analyze social media activity. Various experts have cited a lack of research as an obstacle to a coordinated response. Data science and timeline analysis can help identify higher-risk populations and primary transmission methods. In some cases, a population’s risk increases when it self-segregates into separated communities, something that may sound familiar to Facebook users. The analogy of quarantining a high-risk population from infection vectors holds, but quarantine is likely to be more challenging in this case, as information can propagate in many more ways than a biological virus can.
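As one sketch of what such timeline analysis could involve, the snippet below flags URLs pushed by many distinct accounts within a short window, a pattern often associated with coordinated amplification. The input format, the ten-minute window, and the twenty-account threshold are illustrative assumptions.

```python
# Illustrative timeline-analysis sketch for coordinated amplification.
# `posts` is assumed to be pre-collected (e.g., from a platform API) as
# (timestamp, account, url) tuples; thresholds are illustrative, not tuned.

from collections import defaultdict
from datetime import timedelta

def coordinated_bursts(posts, window=timedelta(minutes=10), min_accounts=20):
    """Flag URLs shared by many distinct accounts inside a short time window."""
    by_url = defaultdict(list)
    for timestamp, account, url in posts:
        by_url[url].append((timestamp, account))

    flagged = {}
    for url, events in by_url.items():
        events.sort()  # order by timestamp
        start = 0
        for end in range(len(events)):
            # Shrink the window from the left until it spans at most `window`.
            while events[end][0] - events[start][0] > window:
                start += 1
            accounts = {account for _, account in events[start:end + 1]}
            if len(accounts) >= min_accounts:
                flagged[url] = len(accounts)  # distinct amplifying accounts
                break
    return flagged
```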

Empower Effective Responders with Resources and Tools

It is important to empower the State Department’s embattled Global Engagement Center, as a RAND paper recently recommended. Combating information operations is the center’s explicit mission, and it already has experience countering propaganda from ISIL. Fact-checkers such as FactCheck.org, PolitiFact, Hamilton 68, and Snopes, which can be considered analogous to emergency first responders, should be empowered and recognized for their work as well.

Tools for analyzing trends and activity on the internet are being developed, but more are needed, and equivalent tools for other media, such as television, remain underdeveloped. Facebook has taken a number of well-publicized steps, and efforts such as the Fake News Challenge have spurred machine-learning approaches to misinformation; the winning team, from Cisco Talos, posted its code on GitHub. However, information operations can utilize other platforms, and analysis tools must keep pace. HIV can spread through sexual contact, so focusing only on shared needles would be ineffective. In the same way, Reddit, YouTube, Twitter, LinkedIn, 4chan, Tumblr, and other platforms need improved tracking of bot activity, more robust reporting mechanisms, stronger safeguards against mass bot creation, and effective mass-mitigation methods.
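For a sense of what such machine-learning tooling involves, here is a toy stance-classification sketch in the spirit of the Fake News Challenge task: given a headline paired with an article body, predict whether the body agrees with, disagrees with, or is unrelated to the headline. This is not the Cisco Talos solution; the model, features, and invented training examples exist only to make the sketch self-contained.

```python
# Toy stance-classification sketch in the spirit of the Fake News Challenge
# task. NOT the Cisco Talos solution; the tiny training set below is invented
# purely so the example runs end to end.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Each example pairs a headline with an article body; the label says whether
# the body agrees with, disagrees with, or is unrelated to the headline.
train_texts = [
    "vaccine causes outbreak [SEP] health officials confirm no link to vaccine",
    "vaccine causes outbreak [SEP] new study claims vaccine triggered cases",
    "vaccine causes outbreak [SEP] local team wins championship game",
]
train_labels = ["disagree", "agree", "unrelated"]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(train_texts, train_labels)

print(model.predict(["vaccine causes outbreak [SEP] experts dispute the claim"]))
```

Real systems use far richer features and training data; the point is that the core pipeline, turning text into features and scoring stance, is well within reach of existing open-source tooling.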

Conclusion

Information and misinformation have always spread virally. But today, technology has increased the speed, number of vectors, and patterns of transmission. By using CDC strategies to contain information operations, we can apply proven ideas to counter a complicated problem that will exist for the foreseeable future. The lessons learned from countering Russian information operations can also be applied to the potentially greater long-term threat of information operations by China.

The analogy is not perfect, of course. Comparing misinformation to HIV/AIDS or Ebola misrepresents its risk and impact. Comparing media and fact-checkers to medical personnel and first responders has a number of flaws as well. Still, to return to my modeling & simulation background, George Box wrote that “all models are wrong, but some are useful.”

For now, I encourage you to improve your immune system by thinking critically about the information you consume, evaluating the source, seeking out balanced and reliable information, and encouraging those around you to do the same.


Daniel Gordon, CISSP, CEH, GCIA, GCTI is a cyber threat intelligence analyst working for the Department of Defense Cyber Crime Center (DC3). He holds a BA in political science from St. Mary’s College of Maryland, an MS in modeling and simulation from the University of Central Florida, and a graduate certificate in modeling and simulation of behavioral cybersecurity from the University of Central Florida. The views and opinions expressed in this article are those of the author and do not necessarily reflect the official policy or position of any organization or employer.

Image: NIAID/Flickr