Social media is being used to stage-manage our democracy through nudge-based strategies

 


A study from the University of Oxford, published this month, has concluded what many of us already know: bots, shills and trolls are working together to spread propaganda and disinformation, disrupt discussions, discredit individuals and manipulate social media users’ political views.

The report warns: “Computational propaganda is one of the most powerful new tools against democracy.” 

The Oxford Internet Institute defines computational propaganda as the use of algorithms, automation and human curation to purposefully distribute misleading information over social media networks. Social media are actively used as a tool for public opinion manipulation, in diverse ways and on a wide range of topics.

Bots and trolls work to stifle authentic, reasoned debate between people in favour of a social network populated by (usually aggressive) argument and soundbites. They can also inflate online measures of social support, such as the number of “likes” (which can, of course, be bought) – crucial in creating the illusion of consensus and encouraging a bandwagon effect.

In democracies, social media are actively used for computational propaganda, through broad efforts at opinion manipulation and through targeted experiments on particular segments of the public (which is antidemocratic in itself). This strategy isn’t so far removed from the “big data” approach, in which individuals are targeted during election campaigns with highly tailored personal messages, designed to appeal to particular “personality types” as discerned through extensive data mining and psychological profiling.

The report also says that “In every country we found civil society groups trying, but struggling, to protect themselves and respond to active disinformation campaigns.” 

The research team involved 12 researchers across nine countries who, altogether, interviewed 65 experts and analysed tens of millions of posts on seven different social media platforms during scores of elections, political crises and national security incidents.

They say that in democracies, individual users design and operate fake and highly automated social media accounts. Political candidates, campaigns and lobbyists rent larger networks of accounts for purpose-built campaigns, while governments assign public resources to the creation, experimentation and use of such accounts.

Ultimately, the presence of bots, shills and trolls on social media is a right-wing bid to stage-manage our democracy, in much the same way that the largest proportion of the rabidly right-wing corporate media has done until recently.

The report describes online propaganda as a “phenomenon that encompasses recent digital misinformation and manipulation efforts”, which “involves learning from and mimicking real people so as to manipulate public opinion across a diverse range of platforms and device networks”.

According to the report, bots “played a small but strategic role” in shaping Twitter conversations during the EU referendum last year. Bots are at their most effective and powerful when working together with trolls.

Political bots – social media bots used for political manipulation – are also effective tools for strengthening online propaganda and hate campaigns. One person, or a small group of people, can use an army of political bots on Twitter to give the illusion of large-scale consensus. Bots are increasingly being used for malicious activities associated with spamming and harassment.

According to the report authors: “The family of hashtags associated with the argument for leaving the EU dominated, while less than one percent of sampled accounts generated almost a third of all the messages.”
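
That finding is, at bottom, a simple concentration measurement: a tiny fraction of accounts producing an outsized share of the traffic. As a rough illustration of how such a figure can be computed, here is a minimal sketch in Python, assuming nothing more than a list of (account, message) records; the function name, the toy dataset and the 1% threshold are illustrative only, and this is not the Oxford team’s actual methodology.

```python
from collections import Counter

def message_concentration(posts, account_share=0.01):
    """Estimate the fraction of all messages produced by the most active
    `account_share` (e.g. the top 1%) of accounts.

    `posts` is assumed to be an iterable of (account_id, message_text)
    tuples -- an illustrative stand-in for a real dataset of tweets."""
    counts = Counter(account for account, _ in posts)
    total_messages = sum(counts.values())
    # Rank accounts by how many messages each posted, most active first.
    ranked = sorted(counts.values(), reverse=True)
    top_n = max(1, int(len(ranked) * account_share))
    return sum(ranked[:top_n]) / total_messages

# Toy example: three hyperactive accounts dominate a much larger pool of
# ordinary users who post once each (invented numbers, not real data).
sample = ([("bot_a", "msg")] * 300 + [("bot_b", "msg")] * 250 +
          [("bot_c", "msg")] * 200 + [(f"user_{i}", "msg") for i in range(1500)])
print(f"Top 1% of accounts produced {message_concentration(sample):.0%} of all messages")
```

On those invented numbers, the top 1% of accounts turn out to have produced roughly a third of the messages, which is the shape of imbalance the report describes.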

Political bots, built to look and act like real citizens, are being deployed in determined anti-democratic efforts to silence opposition and to push official state messaging. Political campaigners, and their supporters, deploy political bots – and computational propaganda more broadly – during elections in attempts to sway the vote and defame critics.

Anonymous political actors harness key elements of computational propaganda, such as false news reports, coordinated disinformation campaigns and troll mobs, to attack human rights defenders, civil society groups, and independent commentators and journalists.

Facebook in particular has attracted a great deal of criticism in recent months, due to the rise and promotion of fake news.

Mark Zuckerberg initially denied that false stories spread through the social network had an effect on the US Presidential election, but changed his stance soon after.

The University of Oxford report says social media sites need to redesign themselves in order to regain trust.

The role of Intelligence Services in the deployment of psy-ops

 

In 2015, Glenn Greenwald published a series of documents from the Joint Threat Research Intelligence Group (JTRIG), a unit of the UK’s GCHQ. He says that although its existence was secret until 2014, JTRIG quickly developed a distinctive public profile after documents from National Security Agency (NSA) whistleblower Edward Snowden revealed that the unit had engaged in “dirty tricks”: deploying sexual “honey traps” designed to discredit targets, launching denial-of-service attacks to shut down internet chat rooms, pushing veiled propaganda onto social networks and generally warping discourse online.

JTRIG’s tactics include seeding propaganda on social media, impersonating people online, and creating false blog posts to discredit targets.

A fascinating, must-read 42-page document from 2011 is particularly revealing about JTRIG’s activities, providing the most comprehensive and sweeping insight to date into the scope of the unit’s extreme methods. Entitled “Behavioral Science Support for JTRIG’s Effects and Online HUMINT [Human Intelligence] Operations”, it describes the types of targets on which the unit focuses, the psychological and behavioural research it commissions and exploits, and its future organisational aspirations.

The document was authored by the psychologist Mandeep K. Dhami, a professor of Decision Psychology. Dhami has provided advice on how JTRIG can improve its approach and attain “desired outcomes”, for example by applying behavioural theories and research on persuasive communication, compliance, obedience, conformity, and the creation of trust and distrust.

Among other things, the document lays out tactics that the agency uses to manipulate public opinion, its scientific and psychological research into how human thinking and behaviour can be profiled and influenced, and the broad range of targets that are traditionally the province of law enforcement rather than intelligence agencies.

Since the general election in the UK, there has been a noticeably massive increase in right-wing trolling presence and activity on Twitter. Most of the activity is directed towards discrediting Jeremy Corbyn. It’s very easy to spot a troll: they make outrageous claims that often read like tabloid headlines, resort quickly to personal attacks and attempts to discredit and smear when you disagree, and they never debate reasonably or evidence their comments.

In my experience, some, however, may initially engage reasonably and make a few concessions to evidenced debate, then suddenly show their true colours by constantly moving the goalposts of the debate to include more disinformation, and by becoming aggressive, very personal and exceedingly irrational. My own management strategy is to address the claims made with a little evidence and fact, and to block without hesitation when it invariably turns ugly.

The Oxford University research report concludes: “For democracies, we should assume that encouraging people to vote is a good thing. Promoting political news and information from reputable outlets is crucial. Ultimately, designing for democracy, in systematic ways, will help restore trust in social media systems. 

“Computational propaganda is now one of the most powerful tools against democracy. Social media firms may not be creating this nasty content, but they are the platform for it.

“They need to significantly redesign themselves if democracy is going to survive social media.”


Further reading

From the Intercept:

The “Cuban Twitter” Scam Is a Drop in the Internet Propaganda Bucket

Controversial GCHQ Unit Engaged in Domestic Law Enforcement, Online Propaganda, Psychology Research

How Covert Agents Infiltrate the Internet to Manipulate, Deceive, and Destroy Reputations – Glenn Greenwald

Theresa May pledges to create new internet that would be controlled and regulated by government 

The media need a nudge: the government using ‘behavioural science’ to manipulate the public isn’t a recent development, nudging has been happening since 2010

 



I don’t make any money from my work. But you can support Politics and Insights and contribute by making a donation which will help me continue to research and write informative, insightful and independent articles, and to provide support to others. The smallest amount is much appreciated, and helps to keep my articles free and accessible to all – thank you.

