“We exploited Facebook to harvest millions of profiles. And built models to exploit that and target their inner demons.” – Cambridge Analytica whistleblower, Christopher Wylie.
It’s been a longstanding major area of concern, of course, that neurotechnologies and ‘behavioural change’ techniques may be used to redirect citizen decision making without their explicit permission. After all, neuromarketing – the idea that the brain, behaviours, emotions and preferences can reveal hidden and profitable truths – is founded on the development of strategies of persuasion in order to profit.
This doesn’t just raise ethical concerns in the marketplace, since neuromarketing strategies are being used in wider contexts, such as in shaping political narratives and communications, election campaigning, policy making and within the media. The motive for employing these techniques is nonetheless about gaining a profit, if not financially, then certainly in terms of advantage and power.
I have criticised behavioural economics extensively and frequently on previous occasions, for precisely the same reasons. Since 2010, it has somehow become acceptable for governments to exercise an influence on the decision-making and behaviours of citizens. Libertarian paternalism, under the guise of ‘behavioural science’, has normalised a manipulative, authoritarian approach for state micro-management of the perceptions, decisions and behaviours of populations. However, being a political doctrine itself, libertarian paternalism is not value-neutral or ‘objective’.
Behavioural economics is a flagrant political misuse of psychology, a form of manipulation exercised without the public’s knowledge and consent. This of course has profound implications for democracy, as the state is ‘acting upon’ citizens in ways that they won’t recognise to change their behaviours and to manipulate their decision-making. In fact the government’s use of behavioural economics turns democracy completely on its head.
It’s accepted uncritically that companies and organisations can be paid to change people’s minds and persuade them to change their decisions and behaviours – whether that is aimed politically at individuals’ perceived ‘faulty’ decision-making, allegedly implicated in their poverty or their claims for welfare support, or at citizens who might otherwise vote for a party that hasn’t paid a PR company to manipulate their voting decisions.
Harvard Law Professor Cass Sunstein (co-author of “Nudge” and one of the architects of ‘nudge’ theory) wrote a controversial paper in 2008 proposing that the US government employ teams of covert agents and pseudo-independent advocates to “cognitively infiltrate” online groups and websites, as well as other activist groups.
Sunstein also proposed sending covert agents into “chat rooms, online social networks, or even real-space groups” which spread what he views as false and damaging “conspiracy theories” about the government. Ironically, the very same Sunstein was named by Obama to serve as a member of the NSA review panel created by the White House, one that – while disputing key NSA claims – proceeded to propose many cosmetic reforms to the agency’s powers (most of which were ignored by the President who appointed them).
Back in 2014, GCHQ documents released from the Edward Snowden archive by Glenn Greenwald were the first to prove that a major western government was using some of the most controversial techniques to disseminate deception online and harm the reputations of targets. The ultimate aim, of course, is to shape public perceptions, decisions and behaviours.
Under the tactics they use, the state is deliberately spreading lies and misinformation on the internet about whichever individuals it targets, including the use of what GCHQ itself calls “false flag operations” and emails to people’s families and friends. The Snowden archive outlines how western intelligence agencies are attempting to manipulate and control online discourse with extreme tactics of deception and reputation-destruction.
Who would possibly trust a government to exercise these powers at all, let alone do so in secret, with virtually no oversight, and outside of any cognizable legal framework?
Then there is, as I’ve discussed, the political misuse of psychology and other social sciences to not only understand, but shape and control, how online activism and discourse unfolds.
Glenn Greenwald’s document, published on The Intercept, touts the work of GCHQ’s “Human Science Operations Cell,” devoted to “online human intelligence” and “strategic influence and disruption.” Under the title “Online Covert Action”, the document details a variety of means to engage in “influence and info ops” as well as “disruption and computer net attack,” while dissecting how human beings can be manipulated using “leaders,” “trust,” “obedience” and “compliance”.
It’s not such a big inferential leap to conclude that governments are attempting to manage legitimate criticism and opposition while stage-managing our democracy.
I don’t differentiate a great deal between the behavioural insights team at the heart of the Conservative cabinet office, and the dark world of PR and ‘big data’ and ‘strategic communications’ companies like Cambridge Analytica. The political misuse of psychology has been disguised as some kind of technocratic “fix” for a failing neoliberal paradigm, and paraded as neutral “science”.
However, its role as an authoritarian prop for an ideological imposition on the population has always been apparent to some of us, because the bottom line is that it is all about influencing people’s perceptions and decisions, using psychological warfare strategies.
The Conservatives’ behaviour change agenda is designed to align citizens’ perceptions and behaviours with neoliberal ideology and the interests of the state. However, in democratic societies, governments are traditionally elected to reflect and meet public needs. The use of “behaviour change” policy involves the state acting upon individuals, instructing them how they must be.
Last year, I wrote a detailed article about some of these issues, including discussion of Cambridge Analytica’s involvement in data mining and the political ‘dark’ advertising that is only seen by its intended recipients. This is a much greater cause for concern than “fake news” in the spread of misinformation, because it is invisible to everyone but the person being targeted. This means that the individually tailored messages are not open to public scrutiny, nor are they fact checked.
A further problem is that no-one is monitoring the impact of the tailored messages and the potential to cause harm to individuals. The dark adverts are designed to exploit people’s psychological vulnerabilities, using personality profiling, which is controversial in itself. Intentionally generating and manipulating fear and anxiety to influence political outcomes isn’t a new thing. Despots have been using fear and slightly less subtle types of citizen “behaviour change” programmes for a long time.
About Cambridge Analytica: political psyops approach verified by a whistleblower
Controversy has arisen concerning Cambridge Analytica‘s use of personal information acquired by an external researcher, who claimed to be collecting it for “academic purposes”. The use of personal data collected without knowledge or permission to establish sophisticated models of users’ personalities raises ethical and privacy issues.
In a somewhat late response, Facebook banned Cambridge Analytica from advertising on its platform. The Guardian has further reported that Facebook had known about this security breach for two years, but did nothing to protect its millions of users.
It is well-known that Cambridge Analytica (CA) collects data on voters using sources such as demographics, consumer activity and internet activity, among other public and private sources. It has been reported that the company is using psychological data derived from millions of Facebook users, largely without users’ permission or knowledge. In short, the company operates using political voter surveillance and strategies of psychological manipulation.
The data analytics firm is a private company that offers services to businesses and political parties who want to “change audience behaviour”. CA combines data mining and data analysis with ‘strategic communication’ for the electoral process. It was created in 2013 as an offshoot of its British parent company, Strategic Communication Laboratories Group, to participate in US politics.
The company claims to use “data enhancement and audience segmentation techniques” providing “psychographic analysis” for a “deeper knowledge of the target audience”. The company is known to use the ‘big five’ OCEAN scale of personality traits, among other methods of psychographic profiling.
The company also claims to use “behavioural microtargeting” and indicates that it can predict ‘needs’ of subjects and how these needs may change over time. Services then can be individually targeted for the benefit of its clients from the political arena, governments, and companies providing “a better and more actionable view of their key audiences.”
CA, who worked with Donald Trump’s election team and the Brexit campaign, has harvested millions of Facebook profiles of US voters, in one of the technological giant’s biggest ever data breaches, and used them to build a powerful software program to psychologically profile, predict and influence citizens’ voting choices. The managing director of CA’s controversial political division is Mark Turnbull, who spent 18 years at the communications firm Bell Pottinger before joining Strategic Communication Laboratories (SCL), which is a British ‘behavioural science’ company.
The SCL Group, which once advised Nato on so-called ‘psy-ops’, is a private British behavioural research and strategic communication company. The company describes itself as a “global election management agency”. SCL’s approach to propaganda is based upon a methodology developed by the associated Behavioural Dynamics Institute (BDI).
Nigel Oakes founded the BDI and also set up SCL; using the methodology developed at BDI, SCL ran election campaigns and national communications campaigns for a broad variety of international governments.
BDI say: “The goal of the BDI is to establish Behavioural Dynamics as a discipline for the study of group behaviour change.”
There isn’t much information around about BDI’s connection with military operations, though links with NATO are well-established – see Countering propaganda: NATO spearheads use of behavioural change science, for example. From the article: “Target Audience Analysis, a scientific application developed by the UK based Behavioural Dynamics Institute, that involves a comprehensive study of audience groups and forms the basis for interventions aimed at reinforcing or changing attitudes and behaviour.”
SCL, on the other hand, has a clearly defined defence and military division, which applies the same Target Audience Analysis methodology to military ‘influence’ operations.
SCL has different ‘verticals’ in politics, military and commercial operations. All of those operations are based on the same methodology (Target Audience Analysis) and, as far as can be discerned from the outside, SCL and affiliates have very obscure corporate structures with confusing ownership.
In the United States, SCL has gained public recognition mainly through its affiliated corporation, Cambridge Analytica, created in 2013 as an offshoot of the SCL Group to participate in US politics. In 2014, CA was involved in 44 US political races.
Their site says: “Cambridge Analytica uses data to change audience behavior.”
There doesn’t seem to be a lot of political will or respect on the right when it comes to the public’s privacy, autonomy in decision making, citizens’ agency and civil liberties.
The current controversy
Working with a whistleblower and ex-employee of Cambridge Analytica, the Observer and Guardian have seen documents and gathered eyewitness reports that lift the lid on the data analytics company that helped Donald Trump to victory. The company is currently being investigated on both sides of the Atlantic.
It is a key subject in two inquiries in the UK – by the Electoral Commission, into the company’s possible role in the EU referendum, and by the Information Commissioner’s Office, into the use of data analytics for political purposes – and one in the US, as part of special counsel Robert Mueller’s probe into Trump-Russia collusion.
Previous articles by Carole Cadwalladr in the Observer and Guardian newspapers, respectively published in February and May 2017, speculated in detail that CA had influenced both the Brexit/Vote Leave option in the UK’s 2016 EU membership referendum and Trump’s 2016 US presidential campaign with Robert Mercer’s backing of Donald Trump being key. They also discuss the legality of using the social data farmed. CA says it is pursuing legal action over the claims made in Cadwalladr’s articles.
The whistleblower, Chris Wylie, claims that 50 million profiles, mostly of Americans, were harvested in one of Facebook’s biggest data breaches. The revelation has caused outrage on both sides of the Atlantic, with lawmakers in both the UK and America, and a state attorney general, calling for greater accountability and regulation. The profiles were harvested by a UK-based academic, Aleksandr Kogan, and his company, Global Science Research (GSR).
Wylie said the personal information mined was used to build a system to influence voters. The Canadian, who previously worked for Cambridge Analytica, has lifted the lid on this and other practices at the company, which he describes as a “full-service propaganda machine”.
Shortly before the story broke, Facebook’s external lawyers warned the Observer that it was making “false and defamatory” allegations and reserved Facebook’s legal position. Facebook denies the harvesting of tens of millions of profiles by CA, working with Cambridge academic Aleksandr Kogan and his firm GSR, was a data breach.
While Facebook insists that it wasn’t a data breach, claiming it was a violation by a third party app that abused user data, this responsibility offloading speaks volumes about Facebook’s approach to its users’ privacy.
Private companies benefit from a lack of transparency over how profits are made from our personal data. Their priority seems to be to silo and hoard our data, prioritising its more commercial uses. Yet we need to think about data differently, moving away from ideas of data as a commodity to be bought and sold, and used to generate profit for a few people – be it financial or political profit.
The internet, and later the World Wide Web, was originally intended to be a democratising force, accessible to all and without walls or ownership. But the reality today is rather different. The inequalities in wealth and power inherent in neoliberalism have seeped online, marketising and commodifying our personal details, choices, views, dispositions, likes and dislikes.
Personal data has become the driving force of the online economy, yet the economic and social value which can be generated from data is not remotely fairly distributed. In fact it isn’t being redistributed at all.
Facebook shoot the messenger
Facebook have also suspended the whistleblower Chris Wylie from the platform “pending further information” over misuse of data, along with his former employer, CA and its affiliates, and the academic they worked with, Aleksandr Kogan.
The public attack on Wylie came after he had approached Facebook about the data breach, offering to help investigate. He described it as a “chilling attack” on someone acting in the public interest.
“They acknowledged my offer but then turned around and shot the messenger. I’m trying to make amends for my mistakes and so should Facebook,” he told the Guardian.
“Facebook has known about this for at least two years and did almost nothing to fix it. This is not new. And it’s only by coming forward that Facebook is now taking action. People need to know this kind of profiling is happening.”
Kogan assembled the harvested information through an app on the site – it collected details of American citizens who were paid to take a personality test, but also gathered data on those people’s Facebook friends.
Kogan apparently had a deal to share this information with CA. But according to Wylie, most of this personal information had been taken without authorisation. He said Cambridge Analytica used it to build a powerful software program to predict and influence choices at the ballot box.
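The leverage in this collection method came from a platform permission of the time that let an app see not just its installers’ data but their friends’ too. A minimal back-of-the-envelope sketch (the function name and the overlap parameter are my own illustrative assumptions; the only figures from reporting are the roughly 270,000 test-takers and the 50-million-profile total):

```python
# Illustrative sketch: how an app installed by a few hundred thousand paid
# test-takers could expose tens of millions of profiles via the
# friends-of-installers permission that Facebook's API then allowed.

def estimate_harvest(test_takers, avg_friends, overlap=0.0):
    """Rough upper bound on distinct profiles reached.

    test_takers -- users who installed the app and took the test
    avg_friends -- average friend count per installer
    overlap     -- fraction of friend lists shared between installers (0-1)
    """
    friends_reached = test_takers * avg_friends * (1 - overlap)
    return test_takers + int(friends_reached)

# Reportedly around 270,000 people took the test; ~50 million profiles were
# harvested. An average of ~190 friends per installer, with little overlap,
# is consistent with that scale.
print(estimate_harvest(270_000, 190))  # → 51570000
```

The point of the sketch is the multiplier: each consenting installer silently contributed a couple of hundred non-consenting profiles.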
Last month, both Facebook and Cambridge Analytica’s CEO, Alexander Nix, told the parliamentary inquiry into fake news that the company did not hold or use private Facebook data, or any data from Kogan’s firm, GSR.
But in its statement on Friday night, explaining why it had suspended CA and Wylie, Facebook said it had known in 2015 that profiles were passed to Nix’s company.
“In 2015, we learned that a psychology professor at the University of Cambridge named Dr Aleksandr Kogan lied to us and violated our platform policies by passing data from an app that was using Facebook Login to SCL/Cambridge Analytica,” the statement said.
CA is heavily funded by the family of Robert Mercer, an American hedge-fund billionaire. I’ve mentioned Mercer in a previous article about the right’s undue influence on the media and on voting behaviour. Mercer made his money as a pioneer in the field of Computational Linguistics.
The company was headed by Trump’s key adviser Steve Bannon. CA used personal information taken without authorisation in early 2014 to build a system that could profile individual US voters, in order to target them with ‘personalised’ persuasive political ‘advertisements’.
It’s scandalous that documents seen by the Observer, and confirmed by the Facebook statement, show that by late 2015 Facebook had found out that information had been harvested on an unprecedented scale, yet failed to alert users, taking only limited steps to recover and secure the private information of more than 50 million individuals.
Last year, Dr Simon Moores, visiting lecturer in the applied sciences and computing department at Canterbury Christ Church University and a technology ambassador under the Blair government, said the Information Commissioner’s Office’s recent decision to shine a light on the use of big data in politics was timely. He said:
“A rapid convergence in the data mining, algorithmic and granular analytics capabilities of companies like Cambridge Analytica and Facebook is creating powerful, unregulated and opaque ‘intelligence platforms’. In turn, these can have enormous influence to affect what we learn, how we feel, and how we vote. The algorithms they may produce are frequently hidden from scrutiny and we see only the results of any insights they might choose to publish.”
He goes on to say: “They were using 40-50,000 different variants of an ad every day that were continuously measuring responses and then adapting and evolving based on that response.”
The head of the parliamentary committee investigating fake news has accused CA and Facebook of misleading MPs in their testimony.
After Wylie detailed the harvesting of more than 50 million Facebook profiles for CA, Damian Collins, the chair of the House of Commons culture, media and sport select committee, said he would be calling on the Facebook boss, Mark Zuckerberg, to testify before the committee.
He said the company appeared to have previously sent executives able to avoid difficult questions who had “claimed not to know the answers”.
Collins also said he would be recalling the CA’s CEO, Alexander Nix, to give further testimony. “Nix denied to the committee last month that his company had received any data from [Kogan’s firm] GSR,” he said. “We will be contacting Alexander Nix next week asking him to explain his comments.”
Collins has attacked Facebook for appearing to have been “deliberately avoiding answering straight questions” put to the committee.
“It is now clear that data has been taken from Facebook users without their consent, and was then processed by a third party and used to support their campaigns,” Collins said. “Facebook knew about this, and the involvement of Cambridge Analytica with it.”
CA claimed that its contract with GSR stipulated that Kogan should seek “informed consent” for data collection and it had no reason to believe he would not.
GSR was “led by a seemingly reputable academic at an internationally renowned institution who made explicit contractual commitments to us regarding its legal authority to license data to SCL Elections”, a company spokesman said.
The Observer has seen a contract dated 4 June 2014, which confirms SCL, an affiliate of CA, entered into a commercial arrangement with GSR, entirely premised on harvesting and processing Facebook data. CA spent nearly $1m on data collection, which yielded more than 50 million individual profiles that could be matched to electoral rolls. It then used the test results and Facebook data to build an algorithm that could analyse individual Facebook profiles and determine personality traits linked to voting behaviour.
The algorithm and database together made a powerful political tool for the right. It allowed a campaign to identify possible swing voters and craft messages more likely to ‘resonate’.
“The ultimate product of the training set is creating a ‘gold standard’ of understanding personality from Facebook profile information,” the contract specifies. It promises to create a database of 2 million ‘matched’ profiles, identifiable and tied to electoral registers, across 11 states, but with room to expand much further.
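The pipeline the contract describes – a small, labelled “training set” of test-takers used to score a far larger pool of profiles for which only likes are available – can be sketched very crudely. This is a hypothetical toy (the function names, trait labels and like names are all my own invented illustrations; real systems used far richer models, data and matching to electoral rolls):

```python
# Toy sketch of the "training set -> gold standard" idea: test-takers supply
# both a questionnaire-derived trait score (the label) and their profile
# "likes" (the features). A model fitted on that pairing can then score
# profiles that never took the test.

from collections import defaultdict

def fit_like_weights(training_rows):
    """Average trait score among users who share each like.

    training_rows -- list of (set_of_likes, trait_score) pairs, e.g. an
                     'openness' score from the personality test.
    """
    totals, counts = defaultdict(float), defaultdict(int)
    for likes, score in training_rows:
        for like in likes:
            totals[like] += score
            counts[like] += 1
    return {like: totals[like] / counts[like] for like in totals}

def predict(likes, weights, default=0.5):
    """Score a profile with no test result, from its likes alone."""
    known = [weights[like] for like in likes if like in weights]
    return sum(known) / len(known) if known else default

# Two test-takers with known 'openness' scores form the training set...
weights = fit_like_weights([
    ({"philosophy", "travel"}, 0.9),
    ({"reality_tv"}, 0.2),
])
# ...and a friend's profile, never tested, is scored from likes alone.
print(round(predict({"travel", "reality_tv"}, weights), 2))  # → 0.55
```

Scaled up from toy averages to proper statistical models, and joined against an electoral register, this is what turns a paid survey of thousands into a scoring instrument for millions.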
CA responded to the Observer story on Twitter before Collins had said Nix would be recalled. “We refute these mischaracterizations and false allegations,” it said:
“Reality Check: Cambridge Analytica uses client and commercially and publicly available data; we don’t use or hold any Facebook data,” the company said. “When we learned GSR sold us Facebook data that it shouldn’t have done, we deleted it all – system wide audit to verify.”
In response to the series of defensive Tweets put out by CA, I quoted several claims from CA’s own site, which I had cited in an article last year.
For example, the company offers to: “More effectively engage and persuade voters using specially tailored language and visual ad combinations crafted with insights gleaned from behavioral understandings of your electorate.”
And boasts:“Leveraging CA’s massive team of data scientists and academics, CA is able to provide optimal audience segments based on big data and psychographic modeling. Then, using a sophisticated electronic data delivery system, CA is able to provide TV advertising campaign data that may be used to inform media buyers about shows that have the highest concentrations of target audiences and the least amount of waste; all of which leading to higher media ROI [return on investment] and more voter conversions.”
“Psychographic Modeling”? “Conversions”? “[…] specially tailored language and visual ad combinations crafted with insights gleaned from behavioral understandings of your electorate” ?
That language doesn’t sound like “advertising” to me. It sounds like microsurveillance and psychological manipulation, using the vulnerabilities that make us susceptible to all kinds of manipulations, including the intentional manipulations performed by the political machinery of our culture.
If CA genuinely thought “people are smarter than that”, then their boasts about their services – psychographic modelling, behavioural science, “understandings of the electorate’s behaviour”, “changing voter behaviours”, increasing “conversions”, “driving” voters to the polls to win campaigns and so on – are nothing more than an elaborate scam. Why bother attempting to manipulate people you think are not susceptible to manipulation?
Either way, this company has transgressed ethical boundaries, either as snake oil merchants, or as peddlers of snake oil on behalf of governments and other clients, while exploiting our personal data. CA’s own site makes further claims:
“CA Political will equip you with the data and insights necessary to drive your voters to the polls and win your campaign. We offer a proven combination of predictive analytics, behavioral sciences, and data-driven ad tech.”
“With up to 5,000 data points on over 230 million American voters, we build your custom target audience, then use this crucial information to engage, persuade, and motivate them to act.”
One of our fundamental freedoms, as human beings, is that of owning the decision making regarding our own lives and experiences, including evaluating and deciding our own political preferences. To be responsible for our own thoughts, reflections, intentions and actions is generally felt to be an essential part of what it means to be human.
When David Cameron said that “knowledge of human behaviour” was part of his vision for a “new age of government” I was one of a few who didn’t see behavioural economics as the great breakthrough in social policy-making that it was being hailed as. Even the name ‘behavioural insights team’ suggests secrecy, surveillance and manipulation. It was only a matter of time before libertarian paternalism morphed into authoritarianism, hidden in plain view.
We are being told what our ‘best interests’ are by a small group of powerful people whose overriding interest is in staying powerful, despite being dogmatic, self-righteous and wrong, and despite the fact that they need specialists in techniques of persuasion, rather than rational and democratic engagement, to appear credible to the electorate.
It seems that the overarching logic of New Right neoliberalism has led to the privatisation of citizens’ decision making and behaviour and a new form of exploiting the population by misuse of their trust and their personal information.
Also, it seems democracy has been commodified and marketised.
Cambridge Analytica are trying to stop the broadcast of an undercover Channel 4 News report in which its chief executive talks unguardedly about its practices. Channel 4 reporters posed as prospective clients and had a series of meetings with Cambridge Analytica that they secretly filmed — including at least one with Alexander Nix, its chief executive.
Channel 4 declined to comment. Cambridge Analytica’s spokesman also declined to comment on the undercover Channel 4 report. The company is under mounting pressure over how it uses personal data in political and election campaign work. Facebook banned it on Friday, claiming it had violated the social network’s rules by failing to delete Facebook user data collected by an app supposedly for ‘research purposes’.
Facebook is now investigating ties between one of its current employees and Cambridge Analytica. Joseph Chancellor, currently a researcher at Facebook, was a director of Global Science Research, a company that provided data to Cambridge Analytica.
The nature of Chancellor’s role as a director of Global Science Research and his knowledge of Kogan’s data collection practices are not clear. A spokesperson for Cambridge Analytica said “there was no recollection of any interactions or emails with” Chancellor.
Facebook didn’t mention Global Science Research. But Cambridge Analytica said on Saturday that it contracted the company in 2014 to “undertake a large scale research project in the United States.”
Global Science Research was incorporated in May 2014 and listed Kogan and Chancellor as directors, according to UK government records. (The records show that Global Science Research was dissolved in October 2017.)
Channel 4 News went ahead to broadcast the Cambridge Analytica exposé despite the legal threat.
Watch Channel 4′s excellent undercover documentary.
Cambridge Analytica questioned on fake news – UK parliament
Controversial GCHQ Unit Engaged in Domestic Law Enforcement, Online Propaganda, Psychology Research – Glenn Greenwald and Andrew Fishman
I don’t make any money from my work. But you can support Politics and Insights and contribute by making a donation which will help me continue to research and write informative, insightful and independent articles, and to provide support to others. The smallest amount is much appreciated, and helps to keep my articles free and accessible to all – thank you.