
The Choice Architecture Of Poverty

Special Rapporteur Philip Alston has presented a United Nations Report on Poverty in the UK. The UK Mainstream Media have not really excelled in analysis or presentation of the findings. After almost a decade of Nudge by Press Release, the Guardian has missed the vital message while the BBC has simply recycled old Government Press Releases. The Mainstream Media seem to be shy about embracing the most damning finding of the report.  

In December 2017, Professor Alston carried out a visit to the USA – California, Alabama, Georgia, Puerto Rico, West Virginia, and Washington DC – carrying out the same kind of investigation as has just finished in the UK. The most damning finding of the UN-US Report on Poverty was similar to the most damning finding of the UN-UK Report on Poverty. Had the Guardian excelled in Journalism they might have highlighted that the UN was not simply finding something isolated.

The Guardian and the BBC might have gone further than concluding that the “Government is in denial”, because following the implications of the most damning finding leads to an uncomfortable conclusion: POVERTY IS A CHOICE.

Both in 2017, in the USA, and in 2018, in the UK, the UN has concluded that poverty is a choice: the Government has decided that the only choice on offer is compliance or poverty. The Mainstream Media are failing to pursue any analysis that follows the implications of that finding, and there is no adequate explanation as to why. The notion that poverty is a choice is one that has been foisted onto everybody by the Government since 2010. Welfare Changes have been touted as Reforms which will enable people to choose to lift themselves out of poverty. That choice takes place within the Choice Architecture that has been created by policy.

In the UN-US Report, Alston states that: 

“…I heard how thousands of poor people get minor infraction notices which seem to be intentionally designed to quickly explode into unpayable debt, incarceration, and the replenishment of municipal coffers…”

In the UN-UK Report, Alston similarly finds that:  

“One of the key features of Universal Credit involves the imposition of draconian sanctions, even for infringements that seem minor. Endless anecdotal evidence was presented to the Special Rapporteur to illustrate the harsh and arbitrary nature of some of the sanctions, as well as the devastating effects that resulted from being completely shut out of the benefits system for weeks or months at a time. As the system grows older, some penalties will soon be measured in years.”

The Mainstream Media make no connection between the American Experience and the British Experience. As if there were no connection between US Policy and UK Policy. As if all the shuttling back and forth between Republicans and Conservatives had never had any impact. As if the Minor Infraction Notices were, in no way, related to Benefit Sanctions. There is an almost wilful blindness: never stray from the press release.

The UN Rapporteur was never commissioned to analyse Nudge Theory. The outcome of eight years of Libertarian Paternalism has transformed British Society into something that, the UN recognises, punishes the Poverty it also chooses to deliver. The overwhelming Mainstream Media response has been the Punch and Judy caricature and Poverty Porn Prurience instead of analysis.

How did a Government get to the point where Human Rights are optional or contingent upon being an Employee? This question is central to the current Welfare Policy which is transforming British Society. It also has an answer, which the UN Rapporteur gives: POVERTY IS A CHOICE.

In putting forward an endless series of press releases and promoting the production of daytime television portraying skivers and strivers, the Department for Work and Pensions has been nudging the Mainstream Media into presenting only a narrative in which strivers can choose to leave poverty and only skivers would want to avoid that choice. The constant nudging – the well written Press Releases that frequently substitute for actual Journalism – has worked. The Government has decided to provide the choice of poverty in a range of ways.

The Government’s provision of choices of poverty underlines that decisions are placed beyond Claimants in a calculated and cruel manner. The Choice Architecture prevents Claimants from making decisions. Decisions would empower Claimants and also permit innovation: Claimants could determine the best course of action for themselves. Instead, the digital by default process has been used to provide a series of choices from which no deviation is permitted.

A Claimant who fails to fill in any choice – and fill it in correctly, and fill it in digitally – automatically chooses poverty. Similarly, those who fail to know that choices have been proffered are choosing poverty. The complexity of the choice architecture is overwhelming – even for those engaged in administering it. It is a system that has been designed to deliver poverty – and it has.  

Claimants need the skills to interact with a State that is being made actively oppositional and digital, as the UN-UK Report highlights:

“The reality is that digital assistance has been outsourced to public libraries and civil society organizations. Public libraries are on the frontline of helping the digitally excluded and digitally illiterate who wish to claim their right to Universal Credit.”

Which is not too distant from the UN-US Report:

“Much more attention needs to be given to the ways in which new technology impacts the human rights of the poorest Americans. This inquiry is of relevance to a much wider group since experience shows that the poor are often a testing ground for practices and policies that may then be applied to others. These are some relevant concerns.”

The truth is, the US and the UK have parallel tracks in overarching Policy objectives: eliminate the State and have the Poor fend for themselves. The emphasis on digital systems as a means to distance Policy Makers from Policy Delivery and to “cut costs” is evident across the US and UK Reports. Pretrial detention has been an area calling for systematic reform in the US for decades. The UN-US Report observes:   

Automated risk assessment tools take “data about the accused, feed it into a computerized algorithm, and generate a prediction of the statistical probability the person will commit some future misconduct, particularly a new crime or missed court appearance.”

“The system will generally indicate whether the risk for the particular defendant, compared to observed outcomes among a population of individuals who share certain characteristics, is ‘high’, ‘moderate’, or ‘low’. Judges maintain discretion, in theory, to ignore the risk score.”
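The mechanics being described can be sketched in a few lines of code. This is a toy illustration only: the features, weights and thresholds below are all invented for the sketch and are not taken from any real pretrial tool.

```python
# Toy sketch of an automated risk assessment tool as the report describes it:
# case data goes in, a numeric score comes out, and the score is mapped onto
# 'low' / 'moderate' / 'high' bands. Every weight and threshold is hypothetical.

def risk_score(age, prior_arrests, missed_appearances):
    """Combine data about the accused into a single score (invented weights)."""
    score = 0
    if age < 25:
        score += 2                      # youth weighted as a risk factor
    score += min(prior_arrests, 5)      # cap the contribution of prior arrests
    score += 3 * missed_appearances     # missed court appearances weigh heavily
    return score

def risk_band(score):
    """Map the score onto the bands a judge would see (invented cut-offs)."""
    if score >= 8:
        return "high"
    if score >= 4:
        return "moderate"
    return "low"

print(risk_band(risk_score(age=22, prior_arrests=1, missed_appearances=2)))   # high
print(risk_band(risk_score(age=40, prior_arrests=0, missed_appearances=0)))   # low
```

The point of the sketch is the one the report makes: the defendant is compared against observed outcomes for people who share certain characteristics, the judge sees only the band, and the discretion to ignore it exists only “in theory”.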

Which reflects the “automated” nature of the Work Capability Assessment for Disabled people in the UK, which the UN has previously reported as putting disabled people at risk of, or actually subjecting them to, grave human rights abuses. In the UN-UK Report the Automated Risk Assessment tools are commented upon:

“But it is clear that more public knowledge about the development and operation of automated systems is necessary. The segmentation of claimants into low, medium and high risk in the benefit system is already happening in contexts such as ‘Risk-based verification.’ Those flagged as ‘higher risk’ are the subject of more intense scrutiny and investigation, often without even being aware of this fact. The presumption of innocence is turned on its head when everyone applying for a benefit is screened for potential wrongdoing in a system of total surveillance. And in the absence of transparency about the existence and workings of automated systems, the rights to contest an adverse decision, and to seek a meaningful remedy, are illusory.”

Which underlines that the Government of the day – regardless of political inclination – is delivering Policy Objectives without transparency, clarity or even sufficient information to determine what the Policy Objectives are. When policy objectives only become clear through outcomes, there is a clear suspicion that Democracy has been subverted. That is the general direction the UN-US and UN-UK Reports indicate: there are serious Human Rights failings, but also a serious democratic deficit arising from the idea that POVERTY IS A CHOICE.

The use of Computer Systems is not neutral or innocent. The Special Rapporteur notes that:   

“…it is worrying that the Data Protection Act 2018 creates a quite significant loophole to the GDPR for government data use and sharing in the context of the Framework for Data Processing by Government.”

The point is not simply that UK Government Departments have “rights” to trawl through personal data, but that it is increasingly criminalised for Claimants – more than eight million people – to object to that trawl or to the sharing of data with Commercial Contractors. Those same Contractors are Employers, and the inevitable consequence of data sharing is to put Claimants at a distinct power and negotiation disadvantage when contracts of Employment are considered. The UK Government Departments have zero obligation to ensure Claimants get the best possible job – simply that Claimants flow off the Register.

Which is how POVERTY IS A CHOICE is being delivered from Government to the People. Interaction with the Department for Work and Pensions has become the single most corrosive interaction with Government that People can have. The design of benefits has become an exercise in delivering the ideological convictions of the Government regardless of the practicality of those convictions. For the Conservative Government, that conviction is that people should be in poverty unless they are Employed. Which ensures the disabled, parents, students, pensioners, entrepreneurs in start-up and Carers are locked into a combative process in which the only exit is to choose poverty.

The UK Mainstream Media is not really exploring this dimension of the UN Rapporteur’s commentary. It leads to uncomfortable terrain for any Journalist. Not least, the intimate connection between the Republicans in the US and the Conservatives in the UK. The ideological convergence of the Conservatives with the Republicans has delivered a wide range of public policy disasters. The Department for Work and Pensions has been allowed carte blanche to redesign the Welfare State based on the Workfare preferred by the Republicans.

The Nudge Unit has crossed, and recrossed, the Atlantic, ensuring that the Conservatives’ historic attachment to “the right to manage” has become inflated, drawing all aspects of social existence into contractual relationships between the Government and the People. The approach dates back to the Republicans’ 1994 “Contract with America”, where everything was reduced to legislation as contract and society became replaceable with a well ordered business.

The UK Mainstream Media is not really capable of exploring these ideas because, quite simply, to do so is to undermine the interests of their owners. Without any need for coercion, the Government is capable of nudging the Media into endlessly propagating the POVERTY IS A CHOICE agenda.  

Despite the comprehensive nature of the UN Rapporteur’s investigations and reporting, there is little about the UN-UK Report that is actually surprising. The connection between the UN-UK and UN-US Reports might well be a surprise to the Media. Realistically, there should be no surprise at all. The Extremists of The Atlantic Bridge, The Heritage Foundation and the myriad of Far Right Think Tanks since Reagan have all been promoting the same ideas on both sides of the Atlantic. They have all been ensuring that the tools exist for Government to make only one choice possible for the People, and that choice is Poverty.

UN-UK Report  

UN-US Report 

Picture: Mika Rottenberg, Bowls Balls Souls Holes, Video Installation, Rose Art Museum, Waltham USA (2014).

This article was written by Hubert Huzzah.


Tories propose nudge, big business AI initiative and ‘personal responsibility’ in place of adequate health care funding


A breakdown of spending on health care under each government up to 2016. Under the Major government, we saw a postcode lottery of health care provision and patients were left for hours on end in hospital corridors. It’s a grim consideration that the Major government spent rather more on health care than the Conservatives in office since 2010.

Earlier this year, the prime minister was warned that patients being treated within the National Health Service are dying prematurely in hospital corridors, in a letter from A&E chiefs outlining “very serious concerns” about patient safety. 

Sixty-eight senior doctors in charge of some of the busiest accident and emergency departments in England and Wales said safety compromises are becoming “intolerable”. 

The letter includes accounts from frontline A&E doctors, one of whom warned 120 patients a day were being treated in corridors because of a lack of space on wards.

The letter said: “The fact remains that the NHS is severely and chronically underfunded. We have insufficient hospital and community beds and staff of all disciplines especially at the front door to cope with our ageing population’s health needs.”

Other issues raised in the letter, first reported in the Health Service Journal, include patients waiting up to 12 hours for a bed after doctors had decided to admit them, with queues of 50 patients waiting in one emergency department. May said that the cancellation of 55,000 appointments was “part of the plan” for the NHS last winter, but said of her government’s response that “nothing is perfect”.

“The National Health Service (NHS) faces significant financial problems in many different areas. It is succeeding in treating more patients than in the past, but this rise in public need for health care, and rising costs coupled with very tight budgets, are translating into widespread pressures on the capacity of staff and managers to keep up with past performance and the standards the service sets itself.”

Lengthening queues for treatment are happening despite the NHS treating more patients. In England, Scotland, Wales and Northern Ireland, the number of episodes of care provided in NHS hospitals has been rising. In England, for example, the number of episodes of care overseen by a hospital consultant has risen 11.4% between 2010/11 and 2015/16. It is just that the rise in the treatment provided is not keeping pace with the even faster rise in the number of people coming forward.

At the same time, England, Scotland and Wales have all started in different ways to look at reducing the provision of treatments that may be deemed of ‘less benefit’ to patients. That means that some people who would have had treatment on the NHS before may not in future.

This decade health services have seen some of the lowest spending increases in their history. In England, real annual increases are only around 1% a year.

Real terms spending has also been roughly flat per person since 2010 in Wales, Scotland and Northern Ireland.

This compares to an average increase of nearly 4% over the history of the NHS, reflecting the fact that, as the OBR has found, an ageing population, new technology and rising wealth all tend to increase health spending in a country.

Matt Hancock, the demedicalisation of illness and the neoliberal psychosocial model

Matt Hancock, the Secretary of State for Health and Social Care, has called on patients to take greater responsibility for their own health in the launch of a new policy paper entitled Prevention is better than cure, which outlines a vision for a “new 21st-century focus on prevention”.

He says he wants to “radically change the focus of health and social care onto prevention”. 

Last month I wrote an article that pre-empted Hancock’s policy paper, published yesterday. I wrote critically about a number of his proposals in Government plans to use your phone and online data to police your lifestyle and predict ‘threats’ to your health.

Hancock has called for an increase in ‘social prescribing’ – referring patients to classes and community groups – in a bid to “shift the balance” away from GPs “automatically prescribing drugs for many illnesses”.

He said in September: “The evidence increasingly shows that activities like social clubs, art, ballroom dancing and gardening can be more effective than medicines for some people and I want to see an increase in that sort of social prescribing.”

In practice, social prescribing means that GPs, nurses and other healthcare practitioners work with patients to identify non-medical opportunities or interventions that will help, improving support and the wider social aspects of their lives. The services that patients can choose from include everything from debt counselling, support groups, allotments and walking clubs, to community cooking classes and one-to-one coaching.

Both evidence and common sense suggest that social prescribing may be particularly appropriate and beneficial for isolated, marginalised groups. It is a needs-led community provision that supports and enhances psychosocial health and wellbeing. However, Hancock seems to think it may be used as a substitute for medicine.

The psychosocial approach has already been used to cut the budget for disability welfare support, with some tragic consequences. Now, the same approach in the form of social prescriptions is being proposed to cut the NHS bill. The University of York has already produced research to show that there is little good quality evidence that social prescribing is cost-effective.

The Conservative government has made a link between social prescriptions, cost-cutting and, as I deeply suspected, a mechanism of extending behavioural modification (euphemistically called “nudging” by the government’s team of behavioural economists and decision-making “experts”).

Nesta, which now partly owns the government’s Behavioural Insights Team (the Nudge Unit), is of course at the forefront of promoting social prescriptions among medical professionals, firmly linking what is a very good idea with very anti-democratic Conservative notions of behaviour change, citizen responsibility and small-state ideology. So it is no longer just about helping people to access a wider range of community-based services and support: social prescribing also places strong emphasis on “encouraging patients to think about how they can take better care of themselves.”

Nesta may have a whopping ‘cognitive bias’ here. A ‘perverse incentive’. It’s called the ‘profit’ incentive.

The same (bio)psychosocial model has been used to disingenuously trivialise and euphemise serious physical illnesses, implying either a psychosomatic basis or reducing symptoms to nothing more than a presentation of malingering tactics. This ploy has been exploited by medical insurance companies (infamously by Unum Provident in the USA) and government welfare departments keen to limit or deny access to medical, social care and social security payments, and to manufacture ideologically determined outcomes that are not at all in the best interests of patients, invalidating diagnoses, people’s experience and accounts, and the existence of serious medical conditions.

Unum was involved in advising the government on making the devastating cuts to disabled people’s support in the UK’s controversial Welfare Reform Bill.

Hancock said in his speech at the International Association of National Public Health Institutes: “Prevention is also about ensuring people take greater responsibility for managing their own health.

“It’s about people choosing to look after themselves better, staying active and stopping smoking. Making better choices by limiting alcohol, sugar, salt and fat.”

Hancock claims it is not about “patronising” patients, “It’s about helping them make better choices, giving them all the support we can, because we know taking the tough decisions is never easy.”

“In the UK, we are spending £97bn of public money on treating disease and only £8bn preventing it across the UK”.

“You don’t have to be an economist to see those numbers don’t stack up.”


No, the numbers don’t stack up. Approximately £14 billion is spent by the Department of Health on things like public health initiatives (which aim to improve people’s health so they don’t need to use the NHS as often), education, training, and infrastructure (such as IT and building new hospitals). 

The Conservatives said in their 2015 election manifesto they would provide £8 billion in government funding, and expected another £22 billion in savings from the NHS. The Nuffield Trust said this still left unanswered questions on funding:

“£8bn is the bare minimum to maintain existing standards of care for a growing and ageing population …

“improving productivity on this scale [£22 billion] would be unprecedented”

The Conservative government followed through on the commitment and then started claiming it was giving £10 billion, providing the NHS what it asked for, and more.

In their 2017 election manifesto, the Conservatives said they would increase NHS spending by at least £8 billion in real terms over the next five years, and increase funding per head of the population for the duration of the parliament.

Last year the think tanks said there would be a £4 billion gap in health spending in 2018/19 alone, but the £1.9 billion provided by the government at the end of last year meant that “around half of the minimum gap we calculated has been filled.”

They said even based on the government’s current spending plans there is likely to be a spending gap of over £20 billion by 2022/23. 

Approximately 44% of NHS trusts—which provide secondary care to patients who’ve been referred there by a GP—were in the red in 2017/18. The figure was 65% just among acute hospital trusts—which make up the bulk of NHS trusts across England.

Collectively they finished 2017/18 with a deficit of around £960 million.

In this context, social prescriptions are used to maintain the status quo, and are likely to be part of a broader process of responsibility ascription – based on the traditional Conservative maxim of self-help, which is used to prop up fiscal discipline and public funding cuts, the extensive privatisation of public services, defence of private property and privilege, and of course, the free market. The irony of the New Right, neoliberal, paternalistic libertarianism is that the associated policies are not remotely libertarian. They are strongly authoritarian. It’s a government that doesn’t respond to public needs, but rather, it’s one that pre-determines public interests to fit within an ideological framework.

Theresa May has pledged millions of pounds to use Artificial Intelligence (AI) to “improve early diagnosis of cancer and chronic disease.” In a speech delivered earlier this year, May also called for the industry and charities to “join the NHS in creating algorithms that can predict a patient’s care requirements based on their medical records and lifestyle information.” 

The government believes that early intervention would provide “less invasive, more affordable and more successful care than late intervention,” which they claim “often fails.”  

While the government has assumed that the unmatched size of the NHS’s collection of data makes it ideal for implementing AI, many are concerned about data privacy.

Importantly, May’s proposal would (once again) allow commercial firms to access NHS data for profit.

In April 2018, a £1bn AI sector deal between UK Government and industry was announced, including £300 million towards AI research. AI is lauded as having the potential to help address important health challenges, such as meeting the care needs of an ageing population. 

Major technology companies – including Google, Microsoft, and IBM – are investing in the development of AI for healthcare and research. The number of AI start-up companies has also been steadily increasing. There are several UK based companies, some of which have been set up in collaboration with UK universities and hospitals.

Partnerships have already been formed between NHS providers and AI developers such as IBM, DeepMind, Babylon Health and Ultromics. Such partnerships have attracted controversy, and wider concerns about AI have been the focus of several inquiries and initiatives within industry, and medical and policy communities. 

Last year, Sir John Bell, a professor of medicine at Oxford University, led a government-commissioned review. He said that NHS patient records are uniquely suited to driving the development of powerful algorithms that could “transform healthcare” and seed an “entirely new industry” in profitable AI-based diagnostics. 

Bell describes the recent controversy surrounding the Royal Free hospital in London granting Google DeepMind access to 1.6m patient records as the “canary in the coalmine”.

“I heard that story and thought ‘Hang on a minute, who’s going to profit from that?’” he said. 

Bell gave the hypothetical example of using anonymised data from chest radiographs to develop an algorithm that eliminated the need for chest x-rays from the ‘analytical pathway’.

“That’s worth a fortune,” he said. “All the value is in the data and the data is owned by the UK taxpayer. There has to be really serious thought about protecting those interests as we go forward.”

Bell also highlighted a “very urgent” need to review how private companies are given access to NHS data and the ownership of algorithms developed using these records.

Hancock, the recently appointed health secretary, is now planning a “radical” and highly invasive system of “predictive prevention”, in which algorithms will use detailed data on citizens to send targeted “healthy living messages” to those flagged as having “propensities to health problems”, such as taking up smoking or becoming obese. 

People’s medical records will be combined with social and smartphone data to predict who will pick up bad habits and stop them getting ill, under radical government proposals. Of course this betrays a fundamental assumption of the government: that illness arises because of bad “lifestyle choices”. 
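A minimal sketch of what such a flagging algorithm might look like follows. Every data field, threshold and “message” here is invented for illustration; nothing reflects any actual DWP, NHS or Public Health England system.

```python
# Hypothetical sketch of "predictive prevention": medical-record and
# smartphone-style lifestyle data are combined to flag "propensities to
# health problems" and select targeted "healthy living messages".

def propensity_flags(citizen):
    """Return the targeted messages a citizen would be sent (invented rules)."""
    flags = []
    # Invented rule: young people in smoking households get a smoking message.
    if citizen.get("smoker_in_household") and citizen.get("age", 0) < 20:
        flags.append("smoking-risk message")
    # Invented rule: low step counts plus a high BMI trigger an obesity message.
    if citizen.get("daily_steps", 0) < 3000 and citizen.get("bmi", 0) > 28:
        flags.append("obesity-risk message")
    return flags

citizen = {"age": 18, "smoker_in_household": True, "daily_steps": 2500, "bmi": 30}
print(propensity_flags(citizen))   # ['smoking-risk message', 'obesity-risk message']
```

The sketch also illustrates the objection raised above: the rules encode assumptions about “lifestyle choices”, and the person being flagged has no visibility of the thresholds applied to them.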

In the policy paper released yesterday, Hancock says: “Prevention means stopping problems from arising in the first place; focusing on keeping people healthy, not just treating them when they become ill. And if they do, it means supporting them to manage their health earlier and more effectively. 

This means giving people the knowledge, skills and confidence to take full control of their lives and their health and social care, and making healthy choices as easy as possible.” 

And: “Last year, over 20 million people used the NHS website. Over the next ten years, digital services will become even more widespread, and the first point of contact for many. The management of health will move out of clinical settings, and into the hands of people. Devices and applications will provide guidance and support around the clock.”

Hancock also said that Public Health England will look at “harnessing digital technology” as a form of “predictive prevention”, potentially leading to targeted health advice for people based on their location and lifestyle. 

His focus is on “improving health, reducing demand for public services and supporting economic growth.”

And: “Predictive prevention will transform public health by harnessing digital technology and personal data – appropriately safeguarded – to prevent people becoming patients. The availability of public data, combined with the existing understanding of wider determinants of health, means we can use digital tools to better identify risks and then help the behaviours of people most in need – before they become patients. 

“Historically, public health has dealt with populations as a whole – a one-size-fits-all approach. The power of predictive prevention comes from enabling people to look at their health in the context of their own life, their own circumstances, and their own behaviour.”

“This means moving beyond a simply clinical view of a body system or disease. It means envisioning a world where everyone can understand their own risks, both in their genetic make-up and from their personal behaviour. We will be able to empower people to make positive changes – and not always in ways we have traditionally thought about.”

Whenever the Conservatives use words like ’empower’, ‘help’ or ‘support’, I worry, because the ideological context of neoliberalism changes the political meaning of these Orwellian Conservative signpost words. Similarly, when Conservatives use the word ‘sustainability’, it is invariably with a view to making catastrophic funding cuts to social safety net provision and wider public services. 

Hancock says more than once: “The ambition is to prevent people becoming patients.”

Inevitably, he came to: “We want to ensure better integration between health and employment support services to help people with health conditions to enter and stay in work. This means ensuring people receive work-related advice and support within the NHS as part of making work a health outcome; on the basis that good work is good for health.” The basis is unverified.

The government conducted a survey in 2011 which showed that if people believe work is good for them, they are less likely to take time off. There is no evidence that demonstrates work is good for health. There is evidence that suggests people who are well enough to work generally do. It is not possible to ‘make’ work ‘a health outcome’. People are either healthy enough to work or they are not. The fact that healthier people work on the whole does not make work ‘good’ for people’s health. Poverty is historically linked with poor health. In-work poverty has risen over the last decade. Having a job is no guarantee of escaping poverty or ill health. 

Job coaches are already asking GPs to refer patients to them, and have even suggested that GPs should make sick notes conditional on patients making an appointment with a work coach. 

Hancock’s proposals hint at a plan to extend conditionality in health care. 

Helen Donovan, Professional Lead for Public Health at the Royal College of Nursing, said: “We welcome the fact that the Health Secretary is making prevention a priority, and clearly recognises that a focus on public health will keep people healthier for longer and save the NHS money and resources in the long run.

“But Matt Hancock must realise his plans will start at a disadvantage as local authorities struggle with planned cuts to public health budgets of almost four per cent per year until 2021. While it’s clear he sees that prevention isn’t an optional extra, we need to see properly funded, accountable services delivered by a fully staffed nursing workforce backed by adequate resources. Without these vital services, disadvantaged areas emerge worse off in life expectancy, and the poorest bear the brunt of underinvestment in public health.”

Jonathan Ashworth MP, Labour’s Shadow Health and Social Care Secretary, said: “The Tories have imposed swingeing cuts to public health services, slashing vital prevention support such as smoking cessation services, sexual health services, substance misuse services and obesity help.

“In local communities, years of cuts and failed privatisation have resulted in health visitor and school nurse numbers falling, whilst children are losing out on the key early years health interventions they need.

“Many of the aims announced today are laudable but the reality is currently a further £1bn worth of cuts to health services including public health are set to be imposed by this Government next year.”

He added that unless the cuts were reversed, the green paper (planned for next year) would “be dismissed as a litany of hollow promises”.

Simon Capewell, a professor of public health and policy at Liverpool University, said the minister was right to emphasise the need for effective prevention of epidemics such as obesity, type 2 diabetes and dementia.

But he added: “We must recognise the huge power of our lived environment, and avoid naively just focusing on ‘personal responsibility’ and ‘individual choices’. People do not ‘choose’ obesity or diabetes or cancer. They have just been overwhelmed by a toxic environment.”

The big drop in the last decade in the number of UK citizens who smoke showed that firm, consistent government action was the best way to boost public health, Capewell said.

He added: “Mr Hancock can celebrate previous health successes with tobacco control. That success was built not on victim blaming, but on strong tax and regulation policies to reduce the ‘three As’ of tobacco affordability, availability and acceptability.”

Ministers need to take similarly tough action now against “the production of the commodities which harm people’s health, including junk food, cheap booze and fixed-odds betting terminals,” he said.

Hancock has of course denied that the government’s austerity programme had an impact on public health. In an interview for the BBC’s Today programme Hancock said: “The biggest impact on your health from the economy is whether or not you have got a job, and there are record numbers of jobs in this country.”

This of course is utter rubbish. If Hancock’s magical thinking were true, health in the UK would have dramatically improved over the last few years in line with ‘record employment levels’. But it hasn’t. The Conservatives are a party that prefers dogma over evidence, ideology over public services and the pseudopsychology of nudge over policies that meet public needs. 

Several health organisations have highlighted that local councils in England have had to cut their public health budgets in recent years, and will do so again next year, because Hancock’s department of health and social care has reduced their grants to divert more money to frontline NHS services. Many local councils have delivered preventative health programmes, but are now finding it increasingly difficult to deliver the statutory services. 

Jonathan Ashworth, the shadow health secretary, heavily criticised Hancock’s remarks. “From telling people to stand up in meetings to now lecturing people about their habits, while cutting £1bn from health services, isn’t a serious plan for improving the health of the nation,” he said.


Over the weekend Theresa May said the Conservatives “are now the natural party of the NHS”, adding that the Government was putting the public health system, created 70 years ago, on a path to “prosper for another 70 years and more”. That is most certainly an empirically unverified statement. 

It’s utter rubbish.

Matt Hancock, Health Secretary, at his office

Related  

Demedicalising illness and deprofessionalising healthcare: Rogue company Unum’s profiteering hand in the government’s work, health and disability green paper

GPs told to consider making fit notes conditional on patients having appointment with work coach

Government plans to use your phone and online data to police your lifestyle and predict ‘threats’ to your health

Cash for Care: nudging doctors to ration healthcare provision

Rationing and resource gatekeeping in the NHS is the consequence of privatisation

 


My work is unfunded and I don’t make any money from it. This is a pay as you like site. If you wish you can support me by making a one-off donation or a monthly contribution. This will help me continue to research and write independent, insightful and informative articles, and to continue to support others.


Government plans to use your phone and online data to police your lifestyle and predict ‘threats’ to your health

“Artificial Intelligence in healthcare is currently geared towards improving patient outcomes, aligning the interests of various stakeholders, and reducing healthcare costs.” CB Insights.


The Care.Data scandal

Back in 2014, public concerns rose because drug and insurance companies were able to buy information about patients – including mental health conditions and diseases such as cancer, as well as smoking and drinking habits – from a single English database of medical data that had been created.

Harvested from GP and hospital records, medical data covering the entire population was uploaded to the repository controlled by an arms-length NHS information centre. Never before had the entire medical history of the nation been digitised and stored in one place. Advocates said that sharing data would make medical advances easier and ultimately save lives, because it would allow researchers to investigate drug side effects or the performance of hospital surgical units by tracking the impact of interventions on patients. 

However, data protection campaigners warned at the time that individuals were at risk of being identified and claimed notes were often inaccurate. The Department of Health was also criticised for failing to inform people they would be automatically opted into the scheme and would need to fill in a form if they wanted their medical records removed. 

All 26 million households in England were notified about the Care.data scheme, so that individuals could choose to opt out, but many didn’t know they had that choice. Those behind the £50 million data-sharing plan said it would “improve healthcare and help medical research.”

Many doctors were so incensed about the failure to protect patients’ data that they opted out their entire surgeries from the database, and the roll-out was eventually aborted in 2014. Privacy experts warned there will be no way for the public to work out who has their medical records or to what use their data will be put. The extracted information contained NHS numbers, date of birth, postcode, ethnicity and gender.
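The privacy experts’ warning is easy to demonstrate. Below is a minimal sketch, using invented records rather than any real extract, of why a dataset that keeps date of birth, postcode and gender remains re-identifiable even with names and NHS numbers removed: in combination, these quasi-identifiers are very often unique.

```python
# Toy illustration (invented records, not real data) of re-identification
# risk: date of birth + postcode + gender are "quasi-identifiers" that
# are frequently unique in combination, even after direct identifiers
# such as names and NHS numbers have been stripped out.
from collections import Counter

records = [
    ("1954-03-02", "NE1 4LP", "F"),
    ("1954-03-02", "NE1 4LP", "M"),
    ("1987-11-20", "SW1A 1AA", "F"),
    ("1961-07-09", "NE1 4LP", "F"),
]

# Count how many records share each (dob, postcode, gender) combination.
combos = Counter(records)

# A record whose combination occurs exactly once can be singled out.
unique = sum(1 for count in combos.values() if count == 1)
print(f"{unique} of {len(records)} records can be singled out")
```

On this toy data every record is unique; on population-scale data the proportion is typically still very high, which is why stripping direct identifiers alone does not amount to anonymisation.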

The controversial £7.5 million NHS database (Care.data) was scrapped very quietly on the same day as the Chilcot Report was released.

Phil Booth of medConfidential – campaigning for medical data privacy – said: “The toxic brand may have ended, but government policy continues to be the widest sharing of every patient’s most private data.” 

“Personal data is now used not only to deliver but to deny services, so it’s more important than ever to check what’s on your records.” 

He goes on to say: “Quite apart from the appalling mistreatment of generations of people, the Windrush scandal highlighted two deep problems about government’s handling of personal data. It confirms the government’s default position is one of disbelief – “guilty until proven innocent”, for some groups at least.

And it also confirms that – despite years of experience of the consequences, the government remains utterly cavalier in its stewardship of your data.

From the Home Office hunting people down through their NHS data and their children’s school records, to Google DeepMind’s secret deal intending to feed 1.6 million Royal Free Hospital patient records to its Artificial Intelligence project to Job Centre bosses interfering in medical records, and the Department for Education packaging up students’ personal data for private exploitation – as many have learned, “the power of data” is not always benign. Whether destroying the Windrush generation’s vital records or losing 25 million people’s records in the post, the consequences of poor information handling practices by Departments of the database state are always damaging to citizens.”

He’s right of course. The principle is one of private profits while the public carry the burden of risks every time. 

There is widespread concern over insurance and marketing companies getting access to our personal health data. The bottom line is that patients must have clear information about what happens to their data, how it may be used, and must be given a clearly stated opportunity to opt out.

The ‘business friendly’ government: déjà vu and AI

Theresa May has again pledged millions of pounds to use Artificial Intelligence (AI) to “improve early diagnosis of cancer and chronic disease.” In a speech delivered earlier this year, May also called for the industry and charities to “join the NHS in creating algorithms that can predict a patient’s care requirements based on their medical records and lifestyle information.” 

The government believes that early intervention would provide “less invasive, more affordable and more successful care than late intervention,” which they claim “often fails.”  

While the government has assumed that the unmatched size of the NHS’s collection of data makes it ideal for implementing AI, many are concerned about data privacy.

Importantly, May’s proposal would once again allow commercial firms to access NHS data for profit.

In April 2018, a £1bn AI sector deal between UK Government and industry was announced, including £300million towards AI research. AI is lauded as having the potential to help address important health challenges, such as meeting the care needs of an ageing population. 

Major technology companies – including Google, Microsoft, and IBM – are investing in the development of AI for healthcare and research. The number of AI start-up companies has also been steadily increasing. There are several UK based companies, some of which have been set up in collaboration with UK universities and hospitals.

Partnerships have been formed between NHS providers and AI developers such as IBM, DeepMind, Babylon Health, and Ultromics. Such partnerships have attracted controversy and wider concerns about AI have been the focus of several inquiries and initiatives within industry, and medical and policy communities. 

Last year, Sir John Bell, a professor of medicine at Oxford university, led a government-commissioned review. He said that NHS patient records are uniquely suited for driving the development of powerful algorithms that could “transform healthcare” and seed an “entirely new industry” in profitable AI-based diagnostics. 

Bell describes the recent controversy surrounding the Royal Free hospital in London granting Google DeepMind access to 1.6m patient records as the “canary in the coalmine”. “I heard that story and thought ‘Hang on a minute, who’s going to profit from that?’” he said.

Bell gave the hypothetical example of using anonymised chest radiograph data to develop an algorithm that eliminated the need for chest x-rays from the ‘analytical pathway’.

“That’s worth a fortune,” he said. “All the value is in the data and the data is owned by the UK taxpayer. There has to be really serious thought about protecting those interests as we go forward.”

However, Bell highlighted a “very urgent” need to review how private companies are given access to NHS data and the ownership of algorithms developed using these records.

Matt Hancock, the recently appointed health secretary, is now planning a “radical” and highly invasive system of “predictive prevention”, in which algorithms will use detailed data on citizens to send targeted “healthy living messages” to those flagged as having “propensities to health problems”, such as taking up smoking or becoming obese. 

Despite promises to safeguard data, the plans have already once again attracted privacy concerns among doctors and campaigners, who say that the project risks backfiring by scaring people or damaging public trust in NHS handling of sensitive information. People’s medical records will be combined with social and smartphone data to predict who will pick up bad habits and stop them getting ill, under radical government proposals. Of course this betrays a fundamental assumption of the government: that illness arises because of bad “lifestyle choices.” 

Hancock said: “So far through history public health has essentially dealt with populations as a whole.

“The anti-smoking campaign on TV is targeted at everybody. But using data, both medical data — appropriately safeguarded, of course, for privacy reasons — and using other demographic data, you can work out that somebody might have a higher propensity to smoke and then you can target interventions much more closely.”
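As a rough sketch of what “targeting interventions” on demographic propensity might look like in code — the feature names, weights and threshold below are purely illustrative assumptions, not any real government or Public Health England model:

```python
# Hypothetical sketch of demographic "propensity targeting". The
# features, weights and threshold are illustrative assumptions only.

def propensity_score(person):
    # Naive weighted sum over a few demographic features.
    score = 0.0
    if person["age"] < 25:
        score += 0.3
    if person["postcode_deprivation_decile"] <= 3:  # 1 = most deprived
        score += 0.4
    if person["parent_smoked"]:
        score += 0.3
    return score

def target_for_intervention(people, threshold=0.5):
    # Return the ids of people flagged for a "healthy living message".
    return [p["id"] for p in people if propensity_score(p) >= threshold]

people = [
    {"id": 1, "age": 22, "postcode_deprivation_decile": 2, "parent_smoked": True},
    {"id": 2, "age": 40, "postcode_deprivation_decile": 8, "parent_smoked": False},
]
print(target_for_intervention(people))  # flags person 1 only
```

Even this toy version makes the policy question visible: the “model” simply encodes its designer’s assumptions about who is at risk, and the people flagged have no sight of why they were singled out.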

However, the historical evidence of the government “safeguarding” our data effectively isn’t particularly confidence-inspiring.

Public Health England is already looking at using demographic and smartphone health data to personalise messages on healthy living and plans to launch pilot projects next year.

Initially the data will be limited to broad categories, such as age or postcode. Ultimately, however, including detailed information on individual housing, employment and income or people’s internet use has not been ruled out. 

Hancock said: “We are now exploring digital services that will use information people choose to share, based on consent with only the highest standards on data privacy, to offer them precise and targeted health advice.”

Advice from whom? Unum and other private insurance companies? Businesses selling lifestyle products? The pharma industry? Many campaigners are very concerned that the use of their data may lead to them being discriminated against by insurers or in the workplace.

Concerns

Another concern is how intrusive surveillance and data analytics are, and how they may dehumanise patients. One NHS suicide prevention app currently in development, for example, will monitor emails, texts and social media for signs that “people might be about to kill themselves.”

“Technology now allows us to offer people predictive prevention; tailored, intelligent advice on how to live longer, healthier lives,” Hancock said. 

“This used to happen within the brains of the GPs in the partnership when they really knew the community and had personal relationships with everyone in the community. As GPs’ practices have come under more pressure, that’s become harder and we can use data really effectively to target people who have propensities to health problems.”

Another danger is the ongoing demedicalisation of illness and the deprofessionalisation of trained doctors. Healthcare professionals may feel that their autonomy and authority is threatened if their expertise is challenged by AI. The ethical obligations of healthcare professionals towards individual patients might be affected by the use of AI decision support systems, given these might be guided by other priorities or interests, such as political ideology regarding cost efficiency or wider public health concerns.

AI systems could also have a negative impact on individual autonomy. For example, if they restrict choices based on calculations about risk or what is in the best interests of the user.

If AI systems are used to make a diagnosis or devise a treatment plan, but the healthcare professional is unable to explain how these were arrived at, this could also be seen as restricting the patient’s right to make free, informed decisions about their health. Applications that aim to imitate a human professional raise the possibility that the user will be unable to judge whether they are communicating with a real person or with technology.  

Although AI applications have the broad potential to reduce human bias and error, they can also reflect and reinforce biases in the data used to train them. Concerns have been raised about the potential of AI to lead to discrimination in ways that may be hidden or which may not align with legally protected characteristics, such as gender, ethnicity, disability, and age. 

The House of Lords Select Committee on AI has cautioned that datasets used to train AI systems are often poorly representative of the wider population and, as a result, could make unfair decisions that reflect wider prejudices in society. The Committee also found that biases can be embedded in the algorithms themselves, reflecting the beliefs and prejudices of AI developers.
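The Committee’s point about unrepresentative training data can be shown with a deliberately tiny example (the groups, labels and counts below are invented): a naive model trained on a dataset dominated by one group simply generalises that group’s pattern to everyone else.

```python
# Invented toy data: a training set with 95 samples from group A and
# only 5 from group B, whose true label differs. A naive model that
# just learns the overall majority label generalises group A's pattern
# to everyone -- an extreme form of the bias the Committee describes.
from collections import Counter

def train_majority(records):
    # "Train" by picking the most common label, ignoring group membership.
    labels = Counter(label for _, label in records)
    return labels.most_common(1)[0][0]

train = [("A", "low_risk")] * 95 + [("B", "high_risk")] * 5

model_label = train_majority(train)

# Every group-B individual is now misclassified.
errors_b = sum(1 for group, label in train
               if group == "B" and label != model_label)
print(model_label, "-", errors_b, "group-B records misclassified")
```

Real systems are far more sophisticated, but the failure mode is the same: overall accuracy looks excellent (95%) while the minority group is wrong 100% of the time.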

Sam Smith, of the privacy group medconfidential, warned that a “ham-fisted” plan might backfire, given the government’s poor record on data-handling. He said: “Predictive intervention has to be done carefully and in the right context and with great empathy and care, as it’s easy to just look creepy and end up with a ‘Mark Zuckerberg problem’ [where a focus on the power of data leads to a neglect of the human problems it is trying to solve].”

Humans have attributes that AI systems might not be able to authentically possess, such as compassion and empathy. Clinical practice often involves very complex judgments and abilities that AI currently is unable to replicate, such as contextual knowledge (for example, existing comorbidities) and the ability to read social cues. There is also debate about whether some human knowledge is tacit and cannot be taught.

AI could also be used for malicious purposes. For example, there are fears that AI could be used for covert surveillance or screening by private companies and others. AI technologies that analyse motor behaviour, (such as the way someone types on a keyboard), and mobility patterns detected by tracking smartphones, could reveal information about a person’s health without their knowledge. 

Today it’s reported that NHS Digital is set to ignore the IT security recommendations of its own chief information officer, Will Smart, citing the estimated cost of between £800 million and £1 billion. It claims that the investment would not be “value for money”.

The recommendations were the result of a review, published in February, that was commissioned by government in response to the WannaCry ransomware attack, which affected one-fifth of all NHS trusts in the UK. The NHS was especially hard hit, not least due to a lack of up-to-date patching on Windows 7 workstations across the monolithic organisation, one of the biggest employers in the world.

The recommendations in Smart’s review had been endorsed by the National Cyber Security Centre (NCSC).

However, documents acquired under Freedom of Information by the Health Service Journal (HSJ), indicate that NHS Digital has opposed adoption of the recommendations on the grounds that they would not “be value for money”. 

NHS Digital’s response comes despite the organisation coming under sustained and continual cyber attacks, including one called Orangeworm that specifically targets sensitive healthcare data. HSJ adds that malicious phishing websites mimicking NHS trusts have also been found, while one NHS organisation was found to have exposed a sensitive database online.

A scan by NHS Digital, it adds, found 227 medical devices connected to the internet with a known vulnerability. And four out of five NHS trusts failed to even respond to a ‘high severity’ cyber alert issued in April.

The review of NHS IT security by CIO Will Smart came four months after a damning report into the state of NHS IT security produced by the National Audit Office, which indicated that the NHS and Department of Health didn’t know how to respond to the outbreak.

With such a cavalier approach to basic IT security, it’s difficult to imagine how we can possibly trust the ‘business-friendly’ government and NHS with the stewardship of our personal health data. Personal data has become the currency by which society does business, but advances in technology should not mean organisations racing ahead of people’s basic rights. Individuals should be the ones in control and government and private organisations alike must do better to demonstrate their accountability to the public.

The use of AI in surgery

The Da Vinci surgical robot

An inquest recently heard how a patient who underwent ‘pioneering’ robotic heart valve surgery at the Freeman Hospital in Newcastle died days later, after the procedure went horribly wrong. 

Stephen Pettitt died of multiple organ failure, yet he had been given a 98-99% chance of surviving the relatively low-risk surgical procedure.

Heart surgeon Sukumaran Nair had been offered training on the use of the robot with the hospital’s gynaecology department – but he refused.

He told a colleague later he could have done more “dry-run” training beforehand, the hearing heard.

The operation was planned to repair a mitral valve but damage was caused by the robot to the interatrial septum. The procedure had to be converted to an open heart operation where the chest was opened up to repair the tear.

Pathologist Nigel Cooper said: “By that time the operation had been going on for a considerable period of time. By the end of the surgery the heart was functioning very poorly.”

Medicines and a machine to help the heart function were brought in but Pettitt’s organs began to shut down and he didn’t recover.

The robot was so loud in use that the surgical team were shouting at one another and the same machine knocked a theatre nurse and destroyed the patient’s stitches, the Newcastle Coroner’s Court heard.

A leading heart surgeon, Professor David Anderson, told Newcastle Coroner’s Court that the operation conducted by the under-trained Sukumaran Nair, using the Da Vinci surgical robot, would likely not have ended that way had the robot not been used.

Anderson, a consultant cardiac surgeon at Guy’s and St Thomas’s Hospital, London, told the Newcastle hearing that Pettitt’s euroSCORE – the risk factor applied to heart surgery patients – was just 1-2% in normal circumstances.

Such was the concern at the completely botched six-hour-long procedure performed on Stephen Pettitt, that Northumbria police have launched a criminal inquiry.

Nair was fired from his job at Newcastle’s Freeman Hospital and their robotics heart programme ended. Newcastle coroner Karen Dilks recorded a narrative verdict into the retired music teacher’s death.

According to the US company behind the botched procedure, Intuitive Surgical Inc, “The surgeon is 100% in control of the da Vinci System at all times.”

However, there can be “serious complications and death” in any surgery, according to the business.

Risks during surgery include inadvertent cuts, tears, punctures, burns or injury to organs.

Don’t those stated risks negate the justification for using robotics to perform surgery – increased, not decreased, precision?

The company add: “It is advised the surgeon switch from minimally invasive surgery to open surgery (through a large incision) or hand-assisted surgery if problems occur.”

Related

Artificial intelligence (AI) in healthcare and research

GPrX – the company that sells NHS data to sales teams so that the pharma industry can market products and target prescribers 

 


 

I don’t make any money from my work. I’m disabled through illness and on a very low income. But you can make a donation to help me continue to research and write free, informative, insightful and independent articles, and to provide support to others. The smallest amount is much appreciated – thank you.
