Executive summary

The reach and speed of the Internet and social media have escalated the potential impact of disinformation. Increases in data transmission capacity, coupled with a shift towards programmatic advertising (footnote 1), have resulted in a precipitous decline in the ability of traditional journalism to mediate the quality of public information. Conventional journalism has been partially displaced by a torrent of data from a virtually unlimited number of originators. Within that torrent is a current of lies and distortions that threatens the integrity of public discourse, debate and democracy.

Agents of disinformation: The actors

Disinformation has become a highly effective tool for state actors, profiteers, status seekers, entertainers and true believers. The most skilled national purveyor of falsehoods is Russia. Its historic mastery of ‘active measures’, magnified by modern technology, follows the basic operational principle of vilify and amplify:

  • Russia’s adhocracy, the shifting elite around President Vladimir Putin, directs an extensive network of Internet trolls and bot networks which generate and spread material across the web. Their activities are intensified by the support of diplomats, state-controlled media outlets such as RT (Russia Today) and Sputnik, as well as de facto alliances with organisations such as WikiLeaks;
  • Working together, these agents of the Russian state can create a false story and ensure it reaches the population segment most likely to be influenced by it through Facebook, Twitter and other channels. They also appear to corroborate the story through news agency interviews featuring phoney experts, forged documents, and doctored photos and videos. Anyone who challenges the lies becomes a target for high-volume online vilification; and
  • Russia, China and the Philippines use disinformation techniques to control their internal populations. Russia stands out for its highly organised strategy of using disinformation to interfere with the political systems of other countries, influence the political views of their citizens, and create and exacerbate division and distrust.

Both Moscow and Beijing have developed sophisticated information doctrines as part of their strategy to consolidate control domestically, and to advance foreign-policy objectives. Both coordinate messages across multiple platforms, with consistent lines advanced through regular news outlets and social media in many languages. Disinformation serves immediate and longer-term strategic objectives. There are important differences, however, between the Russian and Chinese approaches:

  • Russia attempts to alter the perception of reality and identifies exploitable divisions in its target audiences. It pushes a nationalist agenda more than an ideological one and targets the Russian population to prevent dissent. The band of surrounding states that were once part of the USSR is targeted with messages that may ultimately support hybrid warfare. Operations against Western populations aim to weaken resistance to Russian state objectives. In supporting Syria, Russia has used disinformation to cover up the brutality of its attacks on civilian populations;
  • China has created a domestic cyber fortress, and reinforced it with Chinese technology and Chinese high-tech companies. The messages projected domestically and globally are both nationalistic and ideological. Beijing uses its version of soft power to influence the policies of the international community, making effective use of economic power and the presence, in countries of interest, of Chinese populations and businesses; and
  • Russia’s disinformation machinery is explicitly weaponised as a resource for future wars, weakening a target country’s sense of danger and diminishing the will to resist. China wants acceptance of its legitimacy as a great power while rejecting international standards it does not agree with.

The stream of disinformation also flows from other actors:

  • In the Philippines, disinformation has been used as a tactic to influence voters in the presidential election, justify the street-level anti-drug campaign, discredit critics, and de-legitimise mainstream media; and
  • During the Brexit campaign, large numbers of Twitter accounts were active, particularly on the Leave side. Most disappeared immediately after the vote, strongly indicating that they were driven by bots. In their content they reflected the hyper-partisan and simplistic style of the British tabloid press.

Independent emergent activists

State disinformation agencies are part of a complex system which includes independent activists with different but overlapping motivations. Many see hidden conspiracies behind headline events such as mass shootings, or even deny that they happened. They believe Western governments are untrustworthy, manipulate world events, and are aided in hiding the truth by the traditional media. Most are anti-globalist, with a nationalist and anti-immigration rhetoric that attracts elements of both the left and right.

Independent actors use social media and specialised websites to strategically reinforce and spread messages compatible with their own. Their networks are infiltrated and used by state disinformation agencies to amplify the state’s own disinformation strategies against target populations. The extent to which activities within this complex system are orchestrated, and by whom, remains unclear.

Agents of disinformation: The enablers

The information ecosystem enables large-scale disinformation campaigns. False news is spread in many ways, but Facebook and Twitter are especially important tools. Both are used to target specific population segments. Individuals accept the false news as credible or useful, and spread it further. State agencies make extensive use of bots and phoney accounts to popularise false news stories, and spread them in cascading volumes impossible for human actors to produce or vet individually.

Social media companies are becoming aware of their role in the problem, but not all Silicon Valley leaders are convinced of their responsibility to eliminate false news. Fighting spam is a business necessity, but terminating accounts or checking content constrains profitability. Social media companies have a philosophical commitment to the open sharing of information, and many have a limited understanding of the world of intelligence operations. They are reluctant to ally with intelligence agencies and mainstream news organisations to take up the detailed task of monitoring content.

Russian disinformation: The messages

Russian disinformation is adjusted to circumstances and state objectives, but persistent major themes recur: for example, that Western governments are fascist, or that world leaders represent a powerful elite disdainful of, and acting against, ordinary people.

To these general themes are added those which support specific campaigns, such as Russian activity to support the Republican Party during the 2016 presidential campaign in the United States.

The reaction

Multiple actors and agencies are working to counter and defend against this threat:

  • Governments are increasingly insisting that social media companies take responsibility for the content they facilitate. European legislators are ahead of those in the US, in part because social media is heavily used by terrorists;
  • Some governments have moved to block known disinformation media streams in their countries, shielding their citizens from attempts at foreign influence;
  • Many universities and private research groups have analysed disinformation campaigns, using distribution patterns and content indicators to identify bot networks and troll factories; and
  • Specialised organisations have become skilled at debunking false news stories, often in real time, and at educating the public to identify and reject disinformation.

Outlook

The negative impact of false news on democracy could increase if Russia and other established actors become role models for imitators, increasing the distribution of malignant material through all the pathways of the electronic age.

Disinformation poisons public debate and is a threat to democracy. Greater public awareness is needed to distinguish the real from the false. There are many ways for governments and organisations to counter the threat, but there is no guarantee that even effective counter-campaigns can defeat the high-volume flow of malicious communications.
