“We exploited Facebook to harvest millions of profiles. And built models to exploit that and target their inner demons” – Cambridge Analytica whistleblower, Christopher Wylie.
It has long been a major area of concern, of course, that neurotechnologies and ‘behavioural change’ techniques may be used to redirect citizens’ decision making without their explicit permission. After all, neuromarketing – the idea that the brain, behaviours, emotions and preferences can reveal hidden and profitable truths – is founded on developing strategies of persuasion for profit.
This doesn’t just raise ethical concerns in the marketplace, since neuromarketing strategies are being used in wider contexts, such as shaping political narratives and communications, election campaigning, policy making and the media. The motive for employing these techniques is nonetheless profit – if not financial, then certainly in terms of advantage and power.
I have criticised behavioural economics extensively and frequently on previous occasions, for precisely the same reasons. Since 2010, it has somehow become acceptable for governments to exercise an influence on the decision-making and behaviours of citizens. Libertarian paternalism, under the guise of ‘behavioural science’, has normalised a manipulative, authoritarian approach for state micro-management of the perceptions, decisions and behaviours of populations. However, being a political doctrine itself, libertarian paternalism is not value-neutral or ‘objective’.
Behavioural economics is a flagrant political misuse of psychology, a form of manipulation without the public’s knowledge and consent. This of course has profound implications for democracy, as the state is ‘acting upon’ citizens in ways that they won’t recognise to change their behaviours and to manipulate their decision-making. In fact the government’s use of behavioural economics turns democracy completely on its head.
It is accepted uncritically that companies and organisations can be paid to change people’s minds and persuade them to alter their decisions and behaviours – whether the target is individuals’ perceived ‘faulty’ decision-making, allegedly implicated in their poverty or their claims for welfare support, or citizens’ inclination to vote for a party that hasn’t paid a PR company to manipulate their voting decisions.
Harvard Law Professor Cass Sunstein (co-author of “Nudge” and one of the leading proponents of behavioural economics) wrote a controversial paper in 2008 proposing that the US government employ teams of covert agents and pseudo-independent advocates to “cognitively infiltrate” online groups and websites, as well as other activist groups.
Sunstein also proposed sending covert agents into “chat rooms, online social networks, or even real-space groups” which spread what he views as false and damaging “conspiracy theories” about the government. Ironically, the very same Sunstein was named by Obama to serve as a member of the NSA review panel created by the White House, one that – while disputing key NSA claims – proceeded to propose many cosmetic reforms to the agency’s powers (most of which were ignored by the President who appointed them).
Back in 2014, GCHQ documents released from the Edward Snowden archive by Glenn Greenwald were the first to prove that a major western government is using some of the most controversial techniques to disseminate deception online and harm the reputations of targets. The ultimate aim, of course, is to shape public perceptions, decisions and behaviours.
Under the tactics they use, the state is deliberately spreading lies and misinformation on the internet about whichever individuals it targets, including the use of what GCHQ itself calls “false flag operations” and emails to people’s families and friends. The Snowden archive outlines how western intelligence agencies are attempting to manipulate and control online discourse with extreme tactics of deception and reputation-destruction.
Who would possibly trust a government to exercise these powers at all, let alone do so in secret, with virtually no oversight, and outside of any cognizable legal framework?
Then there is, as I’ve discussed, the political misuse of psychology and other social sciences to not only understand, but shape and control, how online activism and discourse unfolds.
The document Glenn Greenwald published on The Intercept touts the work of GCHQ’s “Human Science Operations Cell,” devoted to “online human intelligence” and “strategic influence and disruption.” Under the title “Online Covert Action”, the document details a variety of means to engage in “influence and info ops” as well as “disruption and computer net attack,” while dissecting how human beings can be manipulated using “leaders,” “trust,” “obedience” and “compliance”.
It’s not such a big inferential leap to conclude that governments are attempting to manage legitimate criticism and opposition while stage-managing our democracy.
I don’t differentiate a great deal between the behavioural insights team at the heart of the Conservative cabinet office, and the dark world of PR and ‘big data’ and ‘strategic communications’ companies like Cambridge Analytica. The political misuse of psychology has been disguised as some kind of technocratic “fix” for a failing neoliberal paradigm, and paraded as neutral “science”.
However, its role as an authoritarian prop for an ideological imposition on the population has always been apparent to some of us, because the bottom line is that it is all about influencing people’s perceptions and decisions, using psychological warfare strategies.
The Conservatives’ behaviour change agenda is designed to align citizens’ perceptions and behaviours with neoliberal ideology and the interests of the state. However, in democratic societies, governments are traditionally elected to reflect and meet public needs. The use of “behaviour change” policy involves the state acting upon individuals, and instructing them how they must be.
Last year, I wrote a detailed article about some of these issues, including discussion of Cambridge Analytica’s involvement in data mining and the political ‘dark’ advertising that is only seen by its intended recipients. This is a much greater cause for concern than “fake news” in the spread of misinformation, because it is invisible to everyone but the person being targeted. This means that the individually tailored messages are not open to public scrutiny, nor are they fact checked.
A further problem is that no-one is monitoring the impact of the tailored messages and the potential to cause harm to individuals. The dark adverts are designed to exploit people’s psychological vulnerabilities, using personality profiling, which is controversial in itself. Intentionally generating and manipulating fear and anxiety to influence political outcomes isn’t a new thing. Despots have been using fear and slightly less subtle types of citizen “behaviour change” programmes for a long time.
About Cambridge Analytica: political psyops approach verified by a whistleblower
Controversy has arisen concerning Cambridge Analytica‘s use of personal information acquired by an external researcher, who claimed to be collecting it for “academic purposes”. The use of personal data collected without knowledge or permission to establish sophisticated models of users’ personalities raises ethical and privacy issues.
In a somewhat late response, Facebook banned Cambridge Analytica from advertising on its platform. The Guardian has further reported that Facebook had known about this security breach for two years, but did nothing to protect its millions of users.
It is well-known that Cambridge Analytica (CA) collects data on voters using sources such as demographics, consumer activity and internet activity, among other public and private sources. It has been reported that the company is using psychological data derived from millions of Facebook users, largely without users’ permission or knowledge. In short, the company operates using political voter surveillance and strategies of psychological manipulation.
The data analytics firm is a private company that offers services to businesses and political parties who want to “change audience behaviour”. CA combines data mining and data analysis with ‘strategic communication’ for the electoral process. It was created in 2013 as an offshoot of its British parent company, Strategic Communication Laboratories Group, to participate in US politics.
The company claims to use “data enhancement and audience segmentation techniques” providing “psychographic analysis” for a “deeper knowledge of the target audience”. The company is known to use the ‘big five’ OCEAN scale of personality traits, among other methods of psychographic profiling.
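Cambridge Analytica has not published its models, so the mechanics can only be sketched. The toy example below scores a user on the OCEAN dimensions from a set of page ‘likes’ using a simple linear model; every page name and weight is invented for illustration.

```python
# Toy illustration (not CA's actual model): scoring 'big five' (OCEAN)
# traits from page likes with a linear model. All pages and weights
# below are invented for the example.

# Hypothetical learned weights: trait -> {page: weight}
WEIGHTS = {
    "openness":          {"philosophy_page": 0.9, "talk_show": -0.2},
    "conscientiousness": {"fitness_page": 0.7, "meme_page": -0.4},
    "extraversion":      {"party_events": 0.8, "philosophy_page": -0.3},
    "agreeableness":     {"charity_page": 0.6},
    "neuroticism":       {"news_page": 0.5, "fitness_page": -0.2},
}

def score_profile(likes):
    """Return a crude OCEAN score per trait for a set of liked pages."""
    return {
        trait: sum(w for page, w in page_weights.items() if page in likes)
        for trait, page_weights in WEIGHTS.items()
    }

profile = score_profile({"philosophy_page", "news_page"})
# A campaign would then bucket users by their highest-scoring traits
# and tailor messaging accordingly.
```

In the academic work this kind of profiling draws on, the weights would be learned by regressing questionnaire-based trait scores against millions of users’ likes; the principle, though, is as simple as this.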
The company also claims to use “behavioural microtargeting” and indicates that it can predict ‘needs’ of subjects and how these needs may change over time. Services then can be individually targeted for the benefit of its clients from the political arena, governments, and companies providing “a better and more actionable view of their key audiences.”
CA, who worked with Donald Trump’s election team and the Brexit campaign, has harvested millions of Facebook profiles of US voters, in one of the technological giant’s biggest ever data breaches, and used them to build a powerful software program to psychologically profile, predict and influence citizens’ voting choices. The managing director of CA’s controversial political division is Mark Turnbull, who spent 18 years at the communications firm Bell Pottinger before joining Strategic Communication Laboratories (SCL), which is a British ‘behavioural science’ company.
The SCL Group, which once advised Nato on so-called ‘psy-ops’, is a private British behavioural research and strategic communication company. The company describes itself as a “global election management agency”. SCL’s approach to propaganda is based upon a methodology developed by the associated Behavioural Dynamics Institute (BDI).
Nigel Oakes founded BDI and also set up SCL; using the methodology developed at BDI, he ran election campaigns and national communications campaigns for a broad variety of international governments.
BDI say: “The goal of the BDI is to establish Behavioural Dynamics as a discipline for the study of group behaviour change.”
There isn’t much information around about BDI’s connection with military operations, though links with NATO are well-established – see Countering propaganda: NATO spearheads use of behavioural change science, for example. From the article: “Target Audience Analysis, a scientific application developed by the UK based Behavioural Dynamics Institute, that involves a comprehensive study of audience groups and forms the basis for interventions aimed at reinforcing or changing attitudes and behaviour.”
SCL, on the other hand, has a clearly defined defence and military division, which applies the same Target Audience Analysis methodology to military ‘influence’ operations.
SCL has different ‘verticals’ in politics, military and commercial operations. All of those operations are based on the same methodology (Target Audience Analysis) and, as far as can be discerned from the outside, SCL and affiliates have very obscure corporate structures with confusing ownership.
In the United States, SCL has gained public recognition mainly through its affiliated corporation Cambridge Analytica, created in 2013 as an offshoot of the SCL Group to participate in US politics. In 2014, CA was involved in 44 US political races.
Their site says: “Cambridge Analytica uses data to change audience behavior.”
There doesn’t seem to be a lot of political will or respect on the right when it comes to the public’s privacy, autonomy in decision making, citizens’ agency and civil liberties.
The current controversy
Working with a whistleblower and ex-employee of Cambridge Analytica, the Observer and Guardian have seen documents and gathered eyewitness reports that lift the lid on the data analytics company that helped Donald Trump to victory. The company is currently being investigated on both sides of the Atlantic.
It is a key subject in two inquiries in the UK – by the Electoral Commission, into the company’s possible role in the EU referendum and the Information Commissioner’s Office, into data analytics for political purposes – and one in the US, as part of special counsel Robert Mueller’s probe into Trump-Russia collusion.
Previous articles by Carole Cadwalladr in the Observer and Guardian newspapers, respectively published in February and May 2017, speculated in detail that CA had influenced both the Brexit/Vote Leave option in the UK’s 2016 EU membership referendum and Trump’s 2016 US presidential campaign with Robert Mercer’s backing of Donald Trump being key. They also discuss the legality of using the social data farmed. CA says it is pursuing legal action over the claims made in Cadwalladr’s articles.
The whistleblower, Chris Wylie, claims that 50 million profiles – mostly of American voters – were harvested in one of Facebook’s biggest ever data breaches. The revelation has caused outrage on both sides of the Atlantic, with lawmakers in both the UK and America, and a state attorney general, calling for greater accountability and regulation. The profiles were harvested by a UK-based academic, Aleksandr Kogan, and his company, Global Science Research (GSR).
Wylie said the personal information mined was used to build a system to influence voters. The Canadian, who previously worked for Cambridge Analytica, has lifted the lid on this and other practices at the company, which he describes as a “full-service propaganda machine”.
Shortly before the story broke, Facebook’s external lawyers warned the Observer that it was making “false and defamatory” allegations and reserved Facebook’s legal position. Facebook denies the harvesting of tens of millions of profiles by CA, working with Cambridge academic Aleksandr Kogan and his firm GSR, was a data breach.
While Facebook insists that it wasn’t a data breach, claiming it was a violation by a third party app that abused user data, this responsibility offloading speaks volumes about Facebook’s approach to its users’ privacy.
Private companies benefit from a lack of transparency over how profits are made from our personal data. Their priority seems to be to silo and hoard our data, prioritising its more commercial uses. Yet we need to think about data differently, moving away from ideas of data as a commodity to be bought and sold, and used to generate profit for a few people – be it financial or political profit.
The internet, and later the World Wide Web, was originally intended to be a democratising force, accessible to all and without walls or ownership. But the reality today is rather different. The inequalities in wealth and power inherent in neoliberalism have seeped online, marketising and commodifying our personal details, choices, views, dispositions, likes and dislikes.
Personal data has become the driving force of the online economy, yet the economic and social value which can be generated from data is not remotely fairly distributed. In fact it isn’t being redistributed at all.
Facebook shoots the messenger
Facebook have also suspended the whistleblower Chris Wylie from the platform “pending further information” over misuse of data, along with his former employer, CA and its affiliates, and the academic they worked with, Aleksandr Kogan.
The public attack on Wylie came after he had approached Facebook about the data breach, offering to help investigate. He described it as a “chilling attack” on someone acting in the public interest.
“They acknowledged my offer but then turned around and shot the messenger. I’m trying to make amends for my mistakes and so should Facebook,” he told the Guardian.
“Facebook has known about this for at least two years and did almost nothing to fix it. This is not new. And it’s only by coming forward that Facebook is now taking action. People need to know this kind of profiling is happening.”
Kogan assembled the harvested information through an app on the site – it collected details of American citizens who were paid to take a personality test, but also gathered data on those people’s Facebook friends.
Kogan apparently had a deal to share this information with CA. But according to Wylie, most of this personal information had been taken without authorisation. He said Cambridge Analytica used it to build a powerful software program to predict and influence choices at the ballot box.
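The scale of the harvest follows from simple fan-out arithmetic: each paid test-taker exposed not only their own data but that of all their Facebook friends. The figures below are illustrative assumptions, not numbers from the reporting, but they show how a few hundred thousand participants can yield tens of millions of profiles.

```python
# Fan-out arithmetic behind 'friends of test-takers' collection.
# All figures are illustrative assumptions, not reported numbers.

test_takers = 270_000   # paid personality-test participants (assumed)
avg_friends = 340       # average friend count per participant (assumed)
overlap = 0.55          # fraction of friends shared between participants (assumed)

# Each participant's friend list is harvested too; discount the overlap
# between friend lists to estimate unique profiles exposed.
exposed = round(test_takers * avg_friends * (1 - overlap))
print(f"{exposed:,} friend profiles exposed")  # 41,310,000
```

The point is the asymmetry: consent, such as it was, came from a small paid group, while the overwhelming majority of those profiled never interacted with the app at all.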
Last month, both Facebook and CA CEO Alexander Nix told the parliamentary inquiry into fake news that the company did not have or use private Facebook data, or any data from Kogan’s firm, GSR.
But in its statement on Friday night, explaining why it had suspended CA and Wylie, Facebook said it had known in 2015 that profiles were passed to Nix’s company.
“In 2015, we learned that a psychology professor at the University of Cambridge named Dr Aleksandr Kogan lied to us and violated our platform policies by passing data from an app that was using Facebook Login to SCL/Cambridge Analytica,” the statement said.
CA is heavily funded by the family of Robert Mercer, an American hedge-fund billionaire. I’ve mentioned Mercer in a previous article about the right’s undue influence on the media and on voting behaviour. Mercer made his money as a pioneer in the field of Computational Linguistics.
The company was headed by Trump’s key adviser Steve Bannon. CA used personal information taken without authorisation in early 2014 to build a system that could profile individual US voters, in order to target them with ‘personalised’ persuasive political ‘advertisements’.
It’s scandalous that documents seen by the Observer, and confirmed by the Facebook statement, show that by late 2015 Facebook had found out that information had been harvested on an unprecedented scale, yet failed to alert users and took only limited steps to recover and secure the private information of more than 50 million individuals.
Last year, Dr Simon Moores, visiting lecturer in the applied sciences and computing department at Canterbury Christ Church University and a technology ambassador under the Blair government, said the Information Commissioner’s Office’s recent decision to shine a light on the use of big data in politics was timely. He said:
“A rapid convergence in the data mining, algorithmic and granular analytics capabilities of companies like Cambridge Analytica and Facebook is creating powerful, unregulated and opaque ‘intelligence platforms’. In turn, these can have enormous influence to affect what we learn, how we feel, and how we vote. The algorithms they may produce are frequently hidden from scrutiny and we see only the results of any insights they might choose to publish.”
He goes on to say: ”They were using 40-50,000 different variants of an ad every day that were continuously measuring responses and then adapting and evolving based on that response.”
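Running tens of thousands of ad variants while “continuously measuring responses and then adapting” is, mechanically, an online learning loop. A minimal epsilon-greedy sketch over three hypothetical message framings shows the principle; this illustrates the general technique, not Cambridge Analytica’s actual system, and the variants and response rates are invented.

```python
import random

# Epsilon-greedy selection over ad variants: mostly show the variant
# with the best measured response rate, occasionally explore others.
# Variants and response rates are invented for illustration.
random.seed(0)

variants = ["fear_framing", "hope_framing", "anger_framing"]
shows = {v: 0 for v in variants}
clicks = {v: 0 for v in variants}
true_rate = {"fear_framing": 0.05, "hope_framing": 0.02, "anger_framing": 0.08}

def rate(v):
    """Observed click-through rate for a variant so far."""
    return clicks[v] / shows[v] if shows[v] else 0.0

def pick(epsilon=0.1):
    if random.random() < epsilon:
        return random.choice(variants)   # explore a random variant
    return max(variants, key=rate)       # exploit the current best

for _ in range(10_000):                  # simulated impressions
    v = pick()
    shows[v] += 1
    clicks[v] += random.random() < true_rate[v]   # simulated response

best = max(variants, key=rate)           # budget drifts to what 'works'
```

Over enough impressions the loop concentrates spend on whichever emotional framing provokes the strongest reaction – which is precisely the ethical problem: it optimises for response, not truth.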
The head of the parliamentary committee investigating fake news has accused CA and Facebook of misleading MPs in their testimony.
After Wylie detailed the harvesting of more than 50 million Facebook profiles for CA, Damian Collins, the chair of the House of Commons culture, media and sport select committee, said he would be calling on the Facebook boss, Mark Zuckerberg, to testify before the committee.
He said the company appeared to have previously sent executives able to avoid difficult questions who had “claimed not to know the answers”.
Collins also said he would be recalling the CA’s CEO, Alexander Nix, to give further testimony. “Nix denied to the committee last month that his company had received any data from [his firm] GSR,” he said. “We will be contacting Alexander Nix next week asking him to explain his comments.”
Collins has attacked Facebook for appearing to have been “deliberately avoiding answering straight questions” put to the committee.
“It is now clear that data has been taken from Facebook users without their consent, and was then processed by a third party and used to support their campaigns,” Collins said. “Facebook knew about this, and the involvement of Cambridge Analytica with it.”
CA claimed that its contract with GSR stipulated that Kogan should seek “informed consent” for data collection and it had no reason to believe he would not.
GSR was “led by a seemingly reputable academic at an internationally renowned institution who made explicit contractual commitments to us regarding its legal authority to license data to SCL Elections”, a company spokesman said.
The Observer has seen a contract dated 4 June 2014, which confirms SCL, an affiliate of CA, entered into a commercial arrangement with GSR, entirely premised on harvesting and processing Facebook data. CA spent nearly $1m on data collection, which yielded more than 50 million individual profiles that could be matched to electoral rolls. It then used the test results and Facebook data to build an algorithm that could analyse individual Facebook profiles and determine personality traits linked to voting behaviour.
The algorithm and database together made a powerful political tool for the right. It allowed a campaign to identify possible swing voters and craft messages more likely to ‘resonate’.
“The ultimate product of the training set is creating a ‘gold standard’ of understanding personality from Facebook profile information,” the contract specifies. It promises to create a database of 2 million ‘matched’ profiles, identifiable and tied to electoral registers, across 11 states, but with room to expand much further.
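Matching harvested profiles to electoral registers is, at bottom, a record-linkage problem. The sketch below joins invented profile records to an invented roll on a normalised (name, state) key; real-world matching would use fuzzier keys (address, birth year, phonetic name encodings) and is considerably messier.

```python
# Toy record linkage: joining harvested profiles to an electoral roll
# on a normalised (name, state) key. All records are invented.

def key(name, state):
    return (name.strip().lower(), state.strip().upper())

electoral_roll = [
    {"name": "Jane Q. Public", "state": "OH", "voter_id": "OH-001"},
    {"name": "John Doe",       "state": "TX", "voter_id": "TX-042"},
]
profiles = [
    {"name": "jane q. public", "state": "oh", "traits": {"neuroticism": 0.7}},
    {"name": "Someone Else",   "state": "CA", "traits": {"openness": 0.2}},
]

# Index the roll by normalised key, then attach voter IDs to any
# profile whose key appears in the index.
roll_index = {key(r["name"], r["state"]): r["voter_id"] for r in electoral_roll}
matched = [
    dict(voter_id=roll_index[k], **p)
    for p in profiles
    if (k := key(p["name"], p["state"])) in roll_index
]
# One profile matches and is now tied to a voter_id plus trait scores.
```

The output of such a join is exactly what the contract describes: a profile that is no longer an anonymous data point but an identifiable voter annotated with predicted personality traits.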
CA responded to the Observer story on Twitter before Collins had said Nix would be recalled. “We refute these mischaracterizations and false allegations,” it said:
“Reality Check: Cambridge Analytica uses client and commercially and publicly available data; we don’t use or hold any Facebook data,” the company said. “When we learned GSR sold us Facebook data that it shouldn’t have done, we deleted it all – system wide audit to verify.”
In response to the series of defensive Tweets put out by CA, I quoted several claims from CA’s own site, which I had cited in an article last year.
For example, the company offers to: “More effectively engage and persuade voters using specially tailored language and visual ad combinations crafted with insights gleaned from behavioral understandings of your electorate.”
And boasts:“Leveraging CA’s massive team of data scientists and academics, CA is able to provide optimal audience segments based on big data and psychographic modeling. Then, using a sophisticated electronic data delivery system, CA is able to provide TV advertising campaign data that may be used to inform media buyers about shows that have the highest concentrations of target audiences and the least amount of waste; all of which leading to higher media ROI [return on investment] and more voter conversions.”
“Psychographic Modeling”? “Conversions”? “[…] specially tailored language and visual ad combinations crafted with insights gleaned from behavioral understandings of your electorate”?
That language doesn’t sound like “advertising” to me. It sounds like micro-surveillance and psychological manipulation, exploiting the vulnerabilities that make us susceptible to all kinds of manipulation, including the intentional manipulation performed by the political machinery of our culture.
If CA genuinely thought “people are smarter than that”, then their boasts about their services – psychographic modeling, behavioural science, “understandings of the electorate’s behaviour”, “changing voter behaviours”, increasing “conversions”, “driving” voters to the polls to win campaigns and so on – are nothing more than an elaborate scam. Why bother attempting to manipulate people you think are not susceptible to manipulation?
Either way, this company has transgressed ethical boundaries, either as snake oil merchants, or as peddlers of snake oil on behalf of governments and other clients, while exploiting our personal data.
From CA’s own site: “CA Political will equip you with the data and insights necessary to drive your voters to the polls and win your campaign. We offer a proven combination of predictive analytics, behavioral sciences, and data-driven ad tech.”
“With up to 5,000 data points on over 230 million American voters, we build your custom target audience, then use this crucial information to engage, persuade, and motivate them to act.”
One of our fundamental freedoms, as human beings, is that of owning the decision making regarding our own lives and experiences, including evaluating and deciding our own political preferences. To be responsible for our own thoughts, reflections, intentions and actions is generally felt to be an essential part of what it means to be human.
When David Cameron said that “knowledge of human behaviour” was part of his vision for a “new age of government” I was one of a few who didn’t see behavioural economics as the great breakthrough in social policy-making that it was being hailed as. Even the name ‘behavioural insights team’ suggests secrecy, surveillance and manipulation. It was only a matter of time before libertarian paternalism morphed into authoritarianism, hidden in plain view.
We are being told what our ‘best interests’ are by a small group of powerful people whose interest lies in staying powerful, despite being dogmatic, self-righteous and wrong – and despite the fact that they need specialists in techniques of persuasion, rather than rational and democratic engagement, to appear credible to the electorate.
It seems that the overarching logic of New Right neoliberalism has led to the privatisation of citizens’ decision making and behaviour and a new form of exploiting the population by misuse of their trust and their personal information.
Also, it seems democracy has been commodified and marketised.
Cambridge Analytica are trying to stop the broadcast of an undercover Channel 4 News report in which its chief executive talks unguardedly about its practices. Channel 4 reporters posed as prospective clients and had a series of meetings with Cambridge Analytica that they secretly filmed — including at least one with Alexander Nix, its chief executive.
Channel 4 declined to comment. Cambridge Analytica’s spokesman also declined to comment on the undercover Channel 4 report. The company is under mounting pressure over how it uses personal data in political and election campaign work. It was banned by Facebook on Friday, which claimed it had violated the social network’s rules by failing to delete Facebook user data collected by an app supposedly for ‘research purposes’.
Facebook is now investigating ties between one of its current employees and Cambridge Analytica. Joseph Chancellor, currently a researcher at Facebook, was a director of Global Science Research, a company that provided data to Cambridge Analytica.
The nature of Chancellor’s role as a director of Global Science Research and his knowledge of Kogan’s data collection practices are not clear. A spokesperson for Cambridge Analytica said “there was no recollection of any interactions or emails with” Chancellor.
Facebook didn’t mention Global Science Research. But Cambridge Analytica said on Saturday that it contracted the company in 2014 to “undertake a large scale research project in the United States.”
Global Science Research was incorporated in May 2014 and listed Kogan and Chancellor as directors, according to UK government records. (The records show that Global Science Research was dissolved in October 2017.)
Channel 4 News went ahead to broadcast the Cambridge Analytica exposé despite the legal threat.
From Channel 4: Revealed: Trump’s election consultants filmed saying they use bribes and sex workers to entrap politicians
Watch Channel 4’s excellent undercover documentary.
Cambridge Analytica questioned on fake news – UK parliament
Revealed: 50 million Facebook profiles harvested for Cambridge Analytica in major data breach
Cambridge Analytica: links to Moscow oil firm and St Petersburg university
More allegations of Tory election fraud, now we need to talk about democracy
The anti-social public relations of the PR industry
The Nudge Unit’s u-turn on benefit sanctions indicates the need for even more lucrative nudge interventions, say nudge theorists
How Covert Agents Infiltrate the Internet to Manipulate, Deceive, and Destroy Reputations – Glenn Greenwald
Controversial GCHQ Unit Engaged in Domestic Law Enforcement, Online Propaganda, Psychology Research – Glenn Greenwald and Andrew Fishman
I don’t make any money from my work. But you can support Politics and Insights and contribute by making a donation which will help me continue to research and write informative, insightful and independent articles, and to provide support to others. The smallest amount is much appreciated, and helps to keep my articles free and accessible to all – thank you.
16 thoughts on “Cambridge Analytica, the commodification of voter decision making and marketisation of democracy”
This may also be relevant:
A federal judge is ordering Facebook to face trial over its collection of data from personal images after the company asked to have the case dismissed
Yes, in theory this ought to represent a breach of the UK Data Protection Act, and the EU’s General Data Protection Regulations are going to be included in a new Act in May, so all this data harvesting should not be happening, (he says, mindful that GCHQ are doing the same thing).
I’ve been looking through the GDPR and the rules look quite stringent. Anyone fancy sending London-based Cambridge Analytica a data audit form?
Just as well I don’t use Facebook!
Sometimes I feel like I’ve woken up on the set of Patrick McGoohan’s sixties tv series ‘The Prisoner’.
I can’t remember who exactly, but I know at least one whistleblower has suggested that the real reason the UK and USA governments are collecting so much data on their citizens is so that they can control them. Once people know that the State is watching them, they tend not to cause ‘trouble’.
Still, I can buy the idea that behavioural science companies were gaming the electoral system more than I can the idea that the Russian hackers were somehow hacking offline voting machines.
The Wylie disclosure certainly had quite a media makeover: he sits in a trendy bare room with a big photo-shoot light for the Guardian, and in a graffiti tunnel for ITV news. Yet for all his intellectual prowess, his deductive reasoning interestingly falls short on his employer’s link with Russian oil: “It didn’t make any sense to me,” says Wylie. “I didn’t understand either the email or the pitch presentation we did. Why would a Russian oil company want to target information on American voters?”
The spotlight on this company must be just the tip of the iceberg.
In 2010 I blogged about the EU’s intention to make clear how internet users would have their digital data exploited, and the New York Times carried a comment on the intended EU overhaul of privacy regulations. I wrote then that a publisher’s value was based not on content or brand but on the information that can be collected about each digital visitor: as we click away, our preferences and online patterns are delivered up to the advertising market, because the ability to sell this information about us is the true value a publisher holds. Here is the comment in the New York Times (20 Nov 2010) about the EU’s intention to overhaul the online privacy rules to protect personal data, which would hamper the “development of services” – a great euphemism for snooping:
“Rules requiring Internet companies to secure users’ consent upfront could hamper the development of services that align online advertising with Web users’ personal interests, as reflected in the Web sites they visit or the preferences they express in social networks and other online forums. From a marketer’s perspective, this could dilute one of the big advantages of the Web over traditional media.”
Evidently the misuse of data has been understood for many years (as you have pointed out, Sue). I also noted a New Scientist article from 2010: “EVERY move you make, every twitter feed you update, somebody is watching you. You may not think twice about it, but if you use a social networking site, a cellphone or the internet regularly, you are leaving behind a clear digital trail that describes your behaviour, travel patterns, likes and dislikes, divulges who your friends are and reveals your mood and your opinions. In short, it tells the world an awful lot about you.”
So how did the ‘security services’ miss Cambridge Analytica’s flagrant misuse of data when it has been clearly understood, even in the public realm, for almost a decade? These supposed revelations come at a juncture when tensions have already been hyped far too close to Cold War levels…
Thanks, Michelle, for your thoughtful and detailed comments, and the information.
Your last paragraph reminds me that Theresa May had initially planned a cyber attack on Russia as part of her response to the Skripal poisonings. She was talked down by someone from GCHQ, who pointed out that this would most likely escalate unpredictably. What worried me at the time were May’s earlier, unrelated comments last year – reiterated by Boris Johnson among others – about a nuclear first strike.
The Conservatives have basically said that they would not hesitate to attack, even if we ourselves were not under threat, in ‘some circumstances’. I seem to recall that a serious cyber attack from a hostile state was one such circumstance. This said, cyber attacks in themselves may compromise our nuclear weapons systems.
But your point about ‘why now’ is spot on. That’s why I left the Guardian’s ‘Russia’ link in ‘related’. I’m unsure about it. It does deflect attention from what western so-called democracies are attempting – nothing less than the micro-surveillance and micro-management of citizens’ perceptions, decisions and behaviours. That’s a staggering step towards totalitarianism. And regardless of whether or not ‘behavioural science’ works, the intention behind the use of these psyops strategies is in itself damning, anti-democratic and frightening.
Hi Sue, yes ‘why now’?
It is very worrying; the storyboarded release of the info keeps it better contained to Cambridge Analytica, but what about its parent company, SCL Group? A tweet between Chunky Mark and Lily Allen caught my eye this morning; I screen-grabbed it on my phone and have uploaded it here in case the tweet gets deleted (it shows CA’s methodology was approved by the UK Ministry of Defence, the US State Dept., Sandia and NATO): https://drive.google.com/file/d/1ZigteeFSC8PTQJsxvUX37DMJWJuHrvv_/view?usp=sharing
Original tweet ref: https://twitter.com/lilyallen/status/975990777724768256
However, it is even less pleasant to look at SCL Group’s company page. I screen-grabbed this info this morning and tweeted it back to Lily and Chunky Mark:
The screen grab says – SCL Group, under their Defense heading, state: “We have provided data analytics to Global Combatant Commands (GCC), Theatre Special Operations Commands (TSOC) and partner nations since 2009. Our methodology has been executed in programs commissioned by NATO and Sandia National Labs, and we have developed a complete assessment model for the UK MOD in Information Operations and strategic communication projects. This model has been taught at the NATO StratCom Centre of Excellence.”
The Washington Post piece on SCL Group mentions how they began in the 1990s, with a brief mention of government contracts:
So the ‘security services’ left it to a Guardian correspondent to dig out CA’s manipulations….
Our society desperately needs TLC; the other acronyms listed in the SCL blurb are the result of sociopathic leadership!
Hi Sue, FYI in case you hadn’t seen, further cross-links between SCL and Emerdata:
View at Medium.com
PS: in case you didn’t see this yesterday: “I’ve observed that this niche perception of British expertise in the dark arts can prove helpful for former military contractors when promoting their work in the US. SCL and Cambridge Analytica appear to have similarly extended their reach in politics by using US perceptions of “Britishness”. By way of example, Cambridge Analytica’s former research director, Chris Wylie – who was first to blow the whistle on the company – claimed it had fabricated ties to the University of Cambridge.” Dr Emma Briant, on her CA research and other material released to parliament:
Please note Emma also discloses that she is part of the ‘Editorial Board for Defence Strategic Communication, an academic journal published by NATO.’