Although devices have the potential to collect a large amount of data, experts fear that their increasing use could fragment the traditional doctor–patient relationship

The use of artificial intelligence (AI)-based diagnostic and management systems and smartphone-based automated monitoring is allowing technology to insert itself between the doctor and the patient, which could fundamentally alter their relationship, according to a UK expert.

‘There’s no way we can escape these fundamental forces’, said John Wyatt, Emeritus Professor of Neonatal Paediatrics, Ethics, and Perinatology at University College London.

He told the Royal College of Physicians’ Med+ 2021 conference on 27 October 2021, however, that there is ‘always going to be a place’ for a ‘compassionate human expert’.

If physicians ‘do have a role, then one of those roles … is as the concept of the “wise friend”, and rediscovering and re-emphasising the importance of clinical intuition and the art of medicine’.

He added that the ‘expert consultant knows what the rules are and knows when they can be broken, and when they shouldn’t be broken, and that is something that no machine can ever really know’.

Pandemic impact

Professor Wyatt began his presentation by noting that the COVID-19 pandemic has had an ‘enormous impact’ on the way medicine is practised, and the question therefore is: ‘How do we respond to this, and in particular how do we think about the implications?’

He quoted the example of a GP practice that, at the onset of the pandemic, had switched to a smartphone-based online GP consultation system, which led ‘to all kinds of interesting and challenging issues’.

One was that, for many younger people, communicating with a GP via a smartphone ‘was very similar to being involved in a WhatsApp conversation, and therefore all the normal etiquette and rules [of a consultation were] lost’.

People were incredibly ‘casual’, ‘often very rude’, and ‘informal’, he said, and sent pictures of ‘private body parts’, with the result that the practice was ‘struggling to understand how on earth to practise medicine’ in this context.

Wearables

Professor Wyatt went on to say that another area of technological development with wide-reaching implications is wearable technologies, such as fitness trackers, which have an ‘extraordinary ability to monitor different physiological variables’.

He said that an incredible amount of data can also be inferred from the way in which an individual uses a smartphone, a process known as digital phenotyping.

For example, AI systems can determine an individual’s mental state and predict a crisis from the nature and quality of their speech, their intonation, facial expression, and eye movements, the way they move the smartphone and operate the keyboard, their social media engagement, and even their internet search activity.

Professor Wyatt believes that this is ‘just a … taste of what is coming’, as the combination of a smartphone and wearable technology is ‘an extraordinarily powerful tool for monitoring a huge range of physiological variables’.

But, he asked, how is all that data being monitored and stored? There are also privacy and reliability issues, which were thrown into sharp focus by recent revelations about Google’s DeepMind and IBM’s Watson systems.

Trio

All of this raises the question: ‘As the patient increasingly takes control of their own management by using technology, where does the physician fit in the care pathway?’

Professor Wyatt suspected that ‘one of the implications that happens is, instead of the person-to-person relationship—a duo between the physician and the patient, which is what we’re traditionally used to—there now comes a kind-of trio.’

He explained: ‘There’s the patient, there’s the physician, and there’s the machine, and there’s three-way complex relationships going between this trio’, which leads to the question of how ‘we learn to navigate this’.

Professor Wyatt believes that it could lead to the ‘fragmentation of the traditional doctor–patient care pathways and decision processes’.

He continued: ‘There are obviously huge confidentiality issues’, particularly when making telemedicine calls, as someone else may be ‘in the room’ with a patient and there may be unknown ‘influences’ on how they respond.

Moreover, the absence of physical examination could lead to ‘diagnostic and management errors’ due to an inability to assess individuals accurately, and there are ‘all kinds of problems with remote prescribing’.

Professor Wyatt said that underlying the ‘very rapid’ introduction of AI and digital technologies into healthcare are advances in technology, with hardware and software becoming cheaper and more powerful, and the accumulation of ‘massive health datasets’.

The NHS, he said, ‘is seen as a particularly attractive source of “big data” for commercial companies’, as the ‘bigger the dataset, the more accurate the pattern recognition can become’.

The result is ‘massive’ investment from ‘all the major players’ and many commercial start-ups, all hoping to tap into the ‘absolutely astronomical’ potential profits.

The NHS is also seen as ‘ripe for automation’ because of its ‘very high human staffing levels, inefficient practices, old-fashioned technology, low productivity, and so on’.

Rapid diagnosis

Professor Wyatt said that the drivers of automation are increased speed, accuracy, and economic efficiency, via the rapid scaling up and reproduction of effective technologies across health systems.

The aim is to achieve more accurate and rapid diagnosis and image analysis, particularly in the realm of rare diseases and unusual presentations, and the better prediction and monitoring of treatment responses, alongside the ‘democratisation’ of expert knowledge into ‘resource-poor’ settings.

But this could come at the cost of bias, errors, and hidden discriminations in pre-existing health databases.

In addition, the commercial interests raise concerns that ‘most of these healthcare algorithms are actually proprietary and protected by patents and nondisclosure agreements … and, therefore, it’s often not possible to … interrogate the algorithm’.

There are also questions such as how to ‘maintain human agency and oversight’ of automated systems, and how to adjudicate when automated systems ‘suggest actions that conflict with clinical experience’.

‘And one of the questions about all automated and particularly machine learning systems is: when they fail, do they fail in a safe way?’ Professor Wyatt asked.

Intangibles

Following his presentation, session Chair Sonia Panchal, Consultant Rheumatologist at the South Warwickshire NHS Foundation Trust, asked what challenges clinicians face today in relation to health technologies in the wake of the COVID-19 pandemic.

Professor Wyatt replied that there are currently ‘enormous pressures on the health services, and particularly the backlog … and I think that, increasingly, health managers will be looking at these questions of efficiency and scaling … and saying: how can we get greater output out of the existing workforce?’

‘And the obvious answer is technology … to increasingly take the load off of human beings and put [it] into automated systems.’

This will create huge pressure to introduce such systems, and his worry is that clinicians’ jobs will be looked at and someone will say: ‘You don’t need to do that, you don’t need to do this, you just concentrate on the others and we’ll strip away these other tasks.’

Dr Panchal added: ‘Yet it’s those tasks you sometimes need to be able to do the tasks you are actually there to do.’

‘It’s a holistic thing’, Professor Wyatt said, adding that ‘technological thinking always focuses on measurable outcomes … and the problem, of course, is these utterly intangible but important things about comfort and care and compassion’ cannot be measured or have their importance demonstrated.

Med+ 2021 session: Ethical and moral dilemmas for physicians in the digital age. Presented 27 October 2021.

This article originally appeared on Medscape, part of the Medscape Professional Network.

Credit:

Lead image: Syda Productions/stock.adobe.com
