
Whodunnit? Russia and Coercion through Cyberspace

October 19, 2016

Late in May 2014, a group calling itself CyberBerkut leaked a map of the Ukrainian Dnipropetrovsk Oblast administration’s IT resources, information on the Central Election Commission of Ukraine’s servers, and the correspondence of its staff. In the following days, which included the country’s presidential election, CyberBerkut claimed they had again compromised the election commission’s servers, leaked more confidential information, conducted a distributed denial of service (DDoS) attack on the commission’s website (which instructed potential voters how and where to vote), and blocked the phones of election organizers. The group also released documents implying that the recently appointed governor of the Dnipropetrovsk Oblast, Igor Kolomoisky, was complicit in pro-European Ukrainian plans to promote the “correct” candidate for president of Ukraine.

Despite the best effort of the Russian group behind CyberBerkut, the center-right, pro-European Petro Poroshenko won the Ukrainian presidency. But CyberBerkut wasn’t finished. Almost exactly five months later, the group used similar tactics in the days preceding the Ukrainian parliamentary elections. The results were largely the same: Pro-European candidates won the majority of seats. An uninitiated observer might be tempted to dismiss these events as failed electioneering. After all, Moscow did not succeed in getting its men elected. But to label the operation a failure is to assume that the primary goal was to get pro-Russia officials elected. Over the course of the past four months, we have seen similar operations unfold in the United States, and — as was the case in Ukraine — there are reasons to believe that swaying the election is not the primary objective. Just as in the case of the CyberBerkut incidents, among the key observers of these operations in the United States have been cybersecurity firms like FireEye. The manager of its information operations analysis team recently shared some of the firm’s findings with me, which informs the analysis below.

On the surface, the United States has been targeted by a series of cyber operations that have resulted in email and other confidential information falling into the hands of individuals or groups that go by the names “Guccifer 2.0,” “DCLeaks,” and “WikiLeaks.” The actors behind these monikers have then strategically leaked the stolen information in a way designed to sway U.S. public opinion. Due to the content and timing of the leaks, some posit that this is an attempt by an outside power, Russia, to nudge the U.S. general election to a Trump victory.

This view, however, is myopic and betrays a misunderstanding, or at least an oversimplification, of Russian foreign policy and influence operations. Beyond simple electioneering, what we are experiencing is a broader attempt by the Russian government to sow uncertainty in the institutions that underpin American democracy and power — both hard and soft. As Dmitry Adamsky notes, Russia’s strategic doctrine “is primarily a strategy of influence, not of brute force,” which seeks to break “the internal coherence of the enemy system — and not about its integral annihilation.” Julian Assange, the founder of WikiLeaks, espouses a similar philosophy in considering how to “radically shift regime behavior.”

As everyone who has read a newspaper in the last few years knows, relations between the United States and Russia are strained. Before the Department of Homeland Security and Office of the Director of National Intelligence outright pointed the finger at Russia for the hack of the Democratic National Committee’s server and other campaign-related compromises, several private cybersecurity companies had attributed the data breaches and subsequent leaks to Russia. Here, I will describe how attribution works in a case like this, highlight some of the key data points that show the Russian state is behind this operation, unpack what these events reveal about Russian organization and motivation around cyber and influence operations, and explore options for a U.S. response.

How Does Attribution of a Covert Operation Work?

All cyber operations are covert, at least in the planning and execution stages. Russia’s cyber operations are no different. According to data provided for this article by the private cybersecurity company FireEye, two separate but coordinated teams under the Kremlin are running the campaign. APT28, also known as “FancyBear,” has been tied to Russia’s foreign military intelligence agency, the Main Intelligence Directorate, or GRU. APT29, also known as “CozyBear,” has been tied to the Federal Security Service, or FSB. Both have been actively targeting the United States. According to FireEye, they have only appeared in the same systems once, which suggests a high level of coordination — a departure from what we have seen and come to expect from Russian intelligence. So how does an intelligence agency or company go about attributing behavior that is by definition secretive, and does the fact that this activity is taking place (mostly) online change the procedure?

In essence, experts rely on a bevy of technical and non-technical trend data, or what FireEye threat intelligence analyst Will Glass explains as “the careful accumulation of multiple pieces of evidence in sufficient quantity over time.” These pieces of evidence include things like the scope, scale, and sophistication of an operation, the tactics, techniques, and procedures (TTPs) the team employs, as well as any discernible motivation for the attacks. These factors combine to form what Glass describes as the “fingerprint for their activities,” which “informs an analytic assessment of who is likely responsible.”

Thomas Rid and Ben Buchanan further unpack this fingerprint by breaking the attribution process out into three tiers: technical, operational, and strategic. For our purposes, namely understanding how attributional evidence works, we will depart slightly from their framework and use the same descriptive tiers to discuss the relevant evidence at each level.

The technical layer includes indicators of compromise (like unusual network traffic, anomalous user activity, and geographical irregularities in logins), atomic indicators (like “IP addresses, email addresses, domain names, and small pieces of text” used by the attackers), and the specific tools and malware deployed by the attackers.
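To make the technical layer concrete, here is a minimal sketch of how matching observed atomic indicators against those previously tied to a known group reduces to a set-overlap problem. All indicator values below are invented (documentation-range IPs and placeholder domains), not data from any actual investigation:

```python
# Hypothetical library of atomic indicators previously attributed to a group.
# Real threat-intelligence platforms add timestamps, confidence scores,
# and indicator decay, but the core operation is set intersection.
KNOWN_GROUP_INDICATORS = {
    "ip": {"203.0.113.7", "198.51.100.24"},        # reserved documentation IPs
    "domain": {"example-spoof.net"},
    "email": {"operator@example-spoof.net"},
}

def match_indicators(observed: dict) -> dict:
    """Return the overlap between observed indicators and the known set."""
    return {
        kind: values & observed.get(kind, set())
        for kind, values in KNOWN_GROUP_INDICATORS.items()
    }

# Indicators pulled from a hypothetical intrusion's logs.
observed = {
    "ip": {"203.0.113.7", "192.0.2.1"},
    "domain": {"example-spoof.net"},
}
hits = match_indicators(observed)
total = sum(len(v) for v in hits.values())
print(total)  # prints 2
```

No single overlapping indicator is conclusive on its own; it is the accumulation of such matches, alongside operational and strategic evidence, that supports an attribution judgment.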

The second, operational layer analyzes the human side of an operation. It is at this level that TTPs bear relevance: the mode of entry — was it a spear phishing attack, a spoofed website, or some other form of entry? — the stealthiness of the attack, and the way the attacking team operated once it gained access to the system. Finally, the third, strategic layer helps contextualize the event.

The strategic layer widens the aperture of the intelligence analysts’ lenses and allows the would-be attributers to examine things like concurrent and relevant global trends and geopolitics that may help connect technical and operational dots. When the adversary, or the group that perpetrates the hacks, also strategically leaks the information obtained from the hack or hacks, the lines between the dots of evidence become increasingly clear.
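As a toy illustration of how evidence from the three tiers might be rolled up into a single assessment, the sketch below uses invented weights and scores. Real attribution rests on structured analyst judgment, not a formula; this only shows the shape of the reasoning:

```python
# Hypothetical per-tier weights: technical evidence (malware, indicators)
# tends to carry the most attributional weight, strategic context the least.
TIER_WEIGHTS = {"technical": 0.5, "operational": 0.3, "strategic": 0.2}

def attribution_confidence(evidence: dict) -> float:
    """Weighted sum of per-tier evidence scores, each in [0, 1]."""
    return sum(TIER_WEIGHTS[tier] * score for tier, score in evidence.items())

# Illustrative scores for a case with strong technical overlap,
# familiar TTPs, and a plausible geopolitical motive.
score = attribution_confidence(
    {"technical": 0.9, "operational": 0.8, "strategic": 0.7}
)
print(round(score, 2))  # prints 0.83
```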

How Do We Know It Was the Russians?

With the technical, operational, strategic framework in mind, what makes these cybersecurity companies and the U.S. government so confident that Russia is behind these hacks and leaks?

In addition to the usual timestamps and TTPs, one major piece of technical evidence presents itself. According to Chris Porter, who heads FireEye’s strategic intelligence teams, the tools used to compromise the Democratic National Committee, the CHOPSTICK and SeaDaddy malware, have only ever been seen in use by APT28 and APT29, respectively.

Operationally, the way the malware is deployed fits with what we know about Russian intelligence’s offensive cyber operations. They typically either spoof a website or conduct a targeted spear phishing campaign to install a dropper and eventually achieve remote access to machines and infrastructure. Furthermore, FireEye has discerned patterns in the registration of the fancybear.net and dcleaks.com domains that appear to “match up with the domain registration behavior seen from APT28 in the past.”
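One crude way to surface the kind of spoofed domains described above is to flag newly registered names that sit a small edit distance from a legitimate domain. The sketch below uses hypothetical domain names and a plain Levenshtein implementation; production systems also weigh registrar, registrant details, and registration timing:

```python
def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance via the classic dynamic-programming recurrence."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]

def flag_lookalikes(legit: str, candidates: list, threshold: int = 2) -> list:
    """Return candidate domains within `threshold` edits of the real one,
    excluding exact matches."""
    return [c for c in candidates if 0 < edit_distance(legit, c) <= threshold]

# Hypothetical registrations: 'rn' for 'm' and '1' for 'l' are classic tricks.
suspects = flag_lookalikes(
    "example.org",
    ["exarnple.org", "example.org", "examp1e.org", "unrelated.net"],
)
print(suspects)  # prints ['exarnple.org', 'examp1e.org']
```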

The modus operandi used to spread the hacked information also betrays Russian signatures. First, the agents behind the attacks appear to be co-opting well-known hacking brands, like Anonymous, Guccifer, and PravSector (a Ukrainian political organization). But the activity of these spoofed identities does not comport with the activities we have come to expect from hacktivist groups and their loose affiliates. For example, the tweets and other social media activities undertaken by a group calling themselves the “official Anonymous Poland Twitter” (@AnPoland) strangely received no attention from other factions within Anonymous when they attempted to spread leaked World Anti-Doping Agency data. And the reasons to believe that Guccifer 2.0 is not who he says he is have been well documented. In short, these groups have assumed identities that seem to tie them to established hacktivist groups, but there is no evidence of any actual affiliation between these new monikers and the established hacktivist brands.

Certain tactics around messaging and timing correlate with what FireEye observed of Russian information activity in Ukraine around the annexation of Crimea and the military action in the eastern oblasts. In Ukraine, the group CyberBerkut appeared to run both the network operations (the hacking to steal the sensitive data) and the information operations (the media outreach to disseminate the information). The skillset required to successfully conduct the relatively complex network operations is very different from the skillset needed to effectively leverage an information operation. We have seen the same trends with Guccifer 2.0, DCLeaks, and WikiLeaks. Moreover, according to FireEye’s team, the fact that the data breaches and the information leaks (or announcements that the information is to be leaked) happen in quick succession suggests a team structure with a healthy division of labor, discussed in greater detail below.

Finally, when we take the strategic context into account, the picture sharpens. We have seen the Russian government attempt to manipulate narratives in a way that suits its interests perhaps more than any other state. In addition to the campaign in Ukraine, there is strong evidence to suggest that the Russian state has attempted to shape political narratives in Estonia, the Czech Republic, and within its own country.

Further, as of January 2016, influence operations were officially enshrined in Russian national strategic doctrine. According to the doctrine, the national security organs of the Russian state must continue to be prepared for

growing confrontation in the global information space, due to the desire of some countries to use information and communication technologies to achieve their geopolitical goals, including through the manipulation of public opinion and falsification of history.

What Can We Learn About the Russian Playbook?

As many others have observed, this type of activity is not exactly new. What is new is that the United States is on the receiving end. Russian information operations do not necessarily push a cohesive message. Instead, they tend to identify key audiences and feed information specifically intended for each group. This leads to inconsistent and even contradictory messaging. In a way, this plays into the hands of the Russian operators, whose goal is to sow uncertainty and dissolve confidence in any dominant narrative.

Russian intelligence agencies have been investing in this capability for years, and the organizations appear to retain knowledge over time with regard to how to both organize and operationalize a campaign. As mentioned above, there is reason to believe that a division of labor exists within the teams conducting these operations. At least four discrete skills are needed. First are the on-keyboard operators, who are tasked with the network operations, or hacking, portions of the campaign. Second are the researchers who support the on-keyboard operators by providing them with the tools to carry out the job, whether technical tools, like malware, or social engineering instructions. Third are the more ordinary code developers, who help scale the operation and provide the backbone upon which all the others operate. Finally, there are the information operations specialists.

Because the information operators target specific populations and specific journalists with specific information, there is reason to believe that the information operations specialists possess an above-average understanding of local politics and political factions within the United States. Take, for example, the concerted effort to feed damning DNC information to Gawker and the Smoking Gun, two left-leaning media outlets, during the Democratic primary. This was a nuanced effort to reach a far-left, Sanders-supporting audience to stir up discontent with American institutions, the Democratic Party, and the primary itself. This team structure and regional specialization have enabled the Russians to leverage the pervasiveness of social media to reach their intended audiences in a way that simply was not possible before Facebook and Twitter. According to FireEye’s information operations manager, the overall complexity of these teams is on a level similar to that of U.S. intelligence agencies and is only likely to be housed within a government agency.

What is Russia’s Overarching Goal?

Perhaps unsurprisingly, given the Russians’ ability to craft specific messages for specific audiences, several complementary goals appear, aimed at different parts of U.S. society: a general audience and the political elite that are in tune with national and international security policy.

For the general audience, the goal is likely two-fold. The first is to shake Americans’ confidence in public institutions, including political parties, democratic processes, and the media. The second, slightly less obvious, goal is likely to deflect some attention away from other Russian actions around the world, like their ongoing questionable operations in Syria and Ukraine.

The U.S. national security intelligentsia will likely also see three additional goals. First, Moscow is signaling to the U.S. government in response to the Snowden revelations, which exposed the sophistication and advanced nature of the National Security Agency’s capabilities. Second, and tied to that, this is an attempt to gain attention and recognition for Russian cyber capabilities, and with them prestige on the world stage. Finally, this is likely an attempt by Russia to figure out where America’s red line might be in this context.

What is to Be Done?

The Russian actions have put the Obama administration in a sticky situation for a number of reasons. There is little the administration can do that would dissuade these operations: given Russia’s still-plausible deniability, any response severe enough to make the Russians cease operations would risk an escalatory response from Russia. To withstand that type of response, the administration would need the American public behind it, a tenuous prospect at best.

While the U.S. government’s hands are somewhat tied, there are a couple of simple actions that U.S. organs could take to both inform the Russians that this will not stand and help reshape the narratives the Russian operations have distorted. First, the U.S. government could expel SVR and GRU operatives posted in Washington under diplomatic cover. This is a relatively common tactic to inform an adversary that its intelligence operations in your country are approaching an unacceptable point. Second, U.S. media can do a better job of pointing out inconsistencies in the narratives that the Russians have constructed, as Kurt Eichenwald did last week when he pointed out that he was not, in fact, Sid Blumenthal, despite Russian and Trump camp insistence to the contrary.

 

Robert Morgus is a Policy Analyst with New America’s Cybersecurity Initiative where his research focuses on the intersection of international affairs and cybersecurity. Morgus has spoken about cybersecurity at a number of international forums including NATO CCDCOE’s CyCon, the Global Conference on Cyberspace at The Hague, and CyFy 2015 in New Delhi, India. He contributed a chapter to the upcoming book, Cyber Insecurity, and his research and writing has been published in The New York Times, Slate, the Institute of Electrical and Electronics Engineers, peer-reviewed academic journals, and numerous other national and international media outlets. Special thanks to Aylea Baldwin, Chris Porter, Will Glass, and the rest of the team at FireEye for the data and insights they provided for this article.

Image: Adapted from Pexels and Faloomabinga, CC
