Moscow uses hacks globally to divide and confuse

Confusion about Russia's role in hacking and disinformation campaigns blunted initial responses from the West

What next

The Russian government has enjoyed considerable success using hacking and disinformation campaigns over the last two years to create divisions in foreign states and weaken their resolve, thereby strengthening Moscow's position internationally. Such interference will continue with varying degrees of success. Western governments will aim to formulate a more coherent response to such actions: investing in government-wide cybersecurity and restricting the ability of disinformation campaigns to spread quickly. International cooperation will play an increasingly important role as governments share experiences of how disinformation campaigns have worked, as well as possible defensive initiatives.

Subsidiary Impacts

  • Disinformation campaigns have the potential to erode confidence in democracies and exacerbate political polarisation.
  • Social media outlets will come under greater pressure to regulate their platforms.
  • Governments will invest more money and time in enhancing internal cybersecurity procedures.

Analysis

Hacking and disinformation campaigns affected the 2016 US presidential election, and there has been interference in elections in several European states, including Germany, Bulgaria and France.

Russian involvement

Western intelligence agencies attribute approval for these attacks to Russian President Vladimir Putin. While the technical details behind the attribution have not been published, it is corroborated by reporting in reliable outlets such as the Washington Post and the New York Times.

Russian intelligence agencies are believed to have close relationships with groups of hackers that they direct to carry out hacking campaigns (see RUSSIA: Arrests unlikely to derail cyberespionage - February 21, 2017).

Putin said on June 1 that some cyber operations originating from Russia might come from "patriotically inclined" actors. This brings the official line closer to cybersecurity experts' descriptions of networks of notionally freelance hackers controlled by Russian security agencies (see RUSSIA/US: 'Freelance' hack claim to blunt US probe - June 2, 2017).

Disinformation strategy

Russian attempts to spread disinformation in Europe fall within a broader strategy known as 'active measures', defined as semi-covert or covert intelligence operations that are intended to shape an adversary's political decisions.

One way disinformation campaigns seek to do this is by enlarging the divide between society and governing elites. Although this is a long-standing strategy, the marked divisions that have emerged in many states since the 2008 financial crisis have made such exploitation easier.

Furthermore, by discrediting a newly elected government, disinformation campaigns force it to focus on domestic politics rather than on foreign relations -- in this case, relations with Russia (see RUSSIA/EUROPE: Moscow will exert multiple pressures - December 8, 2016).

Russian intelligence services have used this approach for decades. The use of active measures in recent elections is not altogether new; rather, recent instances represent adaptations of established tactics to a new technical environment.

Macron's defence

Recent disinformation campaigns have caused significant disruption to the political dynamics of many Western societies.

The Macron campaign's hacking response can be replicated

Yet one example stands apart: that of French President Emmanuel Macron. Two days before the May 7 French presidential election, nine gigabytes of emails from Macron's En Marche party were released online as a collection of torrent files.

However, the impact on the Macron campaign was limited. The team's defence had been successful: it announced that it had planted fake documents among those that were leaked, making the public doubt the integrity of the leaks from the outset. The team had also created fake logins and email accounts, inundating would-be hackers with spurious information that slowed their penetration of En Marche's systems.

As a result, the leaks were released shortly before a media blackout imposed by French electoral law, denting their potential impact.
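
For illustration only, the following is a minimal sketch of the decoy ('honeytoken') tactic described above: seeding a mail store with fabricated accounts and documents so that any eventual leak is contaminated and harder to trust. The file names, account formats and domain are assumptions for this sketch, not details of En Marche's actual systems or tooling.

```python
# Hypothetical sketch of a decoy ("honeytoken") seeding step. All names,
# paths and formats are illustrative assumptions, not a description of the
# Macron campaign's real defences.
import csv
import secrets
import string


def fake_password(length: int = 16) -> str:
    """Generate a plausible-looking random password for a decoy account."""
    alphabet = string.ascii_letters + string.digits
    return "".join(secrets.choice(alphabet) for _ in range(length))


def make_decoy_accounts(n: int, domain: str = "example-campaign.fr") -> list:
    """Create n fabricated credential records to mix in with genuine data."""
    return [
        {
            "username": f"staffer{i:03d}@{domain}",
            "password": fake_password(),
            # Recorded internally so defenders can later identify planted data.
            "note": "decoy",
        }
        for i in range(n)
    ]


if __name__ == "__main__":
    # Write the decoys to a file that sits alongside genuine-looking material.
    with open("contacts_internal.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["username", "password", "note"])
        writer.writeheader()
        writer.writerows(make_decoy_accounts(50))
```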

Opportunistic messaging

The success of the Macron campaign can be replicated, but Russian disinformation campaigns are unlikely to subside.

Disinformation campaigns aim to amplify distorted views

Moscow is following a 'damage, confuse and divide' approach to other countries in order to secure Russia's influence on the world stage. Its message is nuanced: Russia may not be able to match living standards elsewhere, but it aims to show that other states are not as clean or well-run as they purport to be, by leaking information about them and co-opting their politicians (see EASTERN EUROPE: Russia may undermine liberal democracy - February 2, 2017).

There does not appear to be a consistent strategy behind this, but Russian operatives will take opportunistic advantage of weak links and aim to amplify distorted views that exacerbate cleavages in society.

Russian interference in the US election, for example, appears initially to have aimed at damaging Democrat Hillary Clinton, as the mid-2016 hacks at the Democratic National Committee suggest. The focus then appears to have shifted to boosting Donald Trump as he came to the fore as a candidate, potentially because he was an outside spoiler and friendly towards Russia. Only at the end did hacking turn towards electoral sites, possibly to undermine public confidence in the integrity of the electoral process and to ensure that whoever was elected arrived in office amid mistrust and scepticism among large portions of US society.

Russia's success in interfering with the US electoral process may owe more to the peculiarities of that particular election than to the effectiveness of hacking and disinformation operations.

Western responses

Still, the US election hacks have sent shockwaves through Western policymaking circles, which are introducing measures to reduce the success rate of future disruption efforts.

One area of focus is cybersecurity within government. Many government staff are not trained in cybersecurity and do not see it as a priority, yet many disinformation campaigns rely on 'phishing', in which users are tricked into entering their password on a fake webpage or opening an untrusted email attachment. Improving personnel's skills will raise the cost and time it takes adversaries to obtain internal documents.
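
As a hedged illustration of the kind of technical control that can complement such training, the sketch below flags links whose domain closely resembles, but does not exactly match, a trusted domain -- a common phishing pattern. The trusted-domain list and similarity threshold are illustrative assumptions, not part of any government's actual filtering setup.

```python
# Minimal sketch of a lookalike-domain check. Trusted domains and the
# similarity threshold are assumptions chosen for illustration.
from difflib import SequenceMatcher
from urllib.parse import urlparse

TRUSTED_DOMAINS = {"en-marche.fr", "gouvernement.fr"}  # illustrative examples


def looks_like_phishing(url: str, threshold: float = 0.8) -> bool:
    """Return True if the URL's domain is a near-miss of a trusted domain."""
    domain = urlparse(url).netloc.lower().split(":")[0]
    if domain in TRUSTED_DOMAINS:
        return False  # exact match: treat as legitimate
    # A near match (e.g. 'en-rnarche.fr') is more suspicious than an unrelated domain.
    return any(
        SequenceMatcher(None, domain, trusted).ratio() >= threshold
        for trusted in TRUSTED_DOMAINS
    )


print(looks_like_phishing("https://en-rnarche.fr/login"))  # True: lookalike domain
print(looks_like_phishing("https://en-marche.fr/login"))   # False: exact match
```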

Even before the June UK snap election was called, Government Communications Headquarters (GCHQ) announced that it would run seminars for politicians and civil servants on Russian disinformation strategy and how to improve cybersecurity.

Social media platforms such as Twitter and Facebook, which play a crucial role in facilitating the spread of disinformation, are also attracting more scrutiny. Twitter in particular made it easy to set up accounts managed by 'bots' -- automated accounts that spread propaganda or fake news with no, or only minimal, human assistance -- which disinformation operatives used to their advantage.
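
For illustration, a minimal sketch of one simple heuristic a platform could use to flag likely automated posting -- sustained posting rates implausible for a human operator -- follows; the threshold and data format are assumptions and do not describe Twitter's actual detection systems.

```python
# Illustrative bot-flagging heuristic: any rolling one-hour window with an
# implausibly high post count is flagged. The threshold is an assumption.
from datetime import datetime, timedelta


def likely_bot(post_times: list, max_posts_per_hour: int = 60) -> bool:
    """Flag an account if any rolling one-hour window contains too many posts."""
    times = sorted(post_times)
    window = timedelta(hours=1)
    start = 0
    for end, t in enumerate(times):
        # Advance the window start until it spans at most one hour.
        while t - times[start] > window:
            start += 1
        if end - start + 1 > max_posts_per_hour:
            return True
    return False


# Example: 200 posts in roughly 30 minutes would be flagged.
burst = [datetime(2017, 6, 1, 12, 0) + timedelta(seconds=9 * i) for i in range(200)]
print(likely_bot(burst))  # True
```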

The key role that social media platforms play in the spread of fake news means that policymakers are placing greater legal and ethical responsibilities on them (see INTERNATIONAL: 'Fake news' fight focuses on internet - March 22, 2017). The German cabinet on April 5 approved legislation to fine social media companies up to 50 million euros (53 million dollars) if they fail to remove hate content in a timely manner.

Governments will also be under increasing pressure to respond more assertively to disinformation campaigns. This could involve publicising such campaigns to make the public more resilient and better able to spot propaganda, 'naming and shaming' the aggressor country, or even introducing economic sanctions, thereby raising the stakes for aggressors.