When Artificial Intelligence gets it wrong, who gets the blame?


A fascinating blog post by Elizabeth Curtin and Jonathan Tomlin of international law firm Sidley considers where liability for AI diagnosis sits.

Medtech is a huge industry, and billions of dollars of investment and development have poured into the sector in the last few years. Among the big advances has been the rise of Artificial Intelligence (AI) to diagnose patient conditions.

The Royal Free Hospital in London is currently running a study to see if AI can replace the second radiologist in mammogram assessment – the use of a second pair of eyes has been seen to improve detection rates but it is resource intensive and there is a shortage of radiologists.

All very sensible and a possible solution to the frightening shortage of healthcare professionals in the sector. 

But what if AI gets it wrong? 

According to Curtin and Tomlin:

“Any litigation that arises from medtech that uses AI – especially AI used as part of a diagnosis or intervention – is likely to be complicated. Medtech often involves a complex chain of actions involving a number of different parties, ranging from medical device manufacturers to programmers to physicians. If AI is blamed for misdiagnosing a patient, it may be attributed to a series of connected events rather than to a single failure. In such circumstances, personal injury plaintiffs may seek remedies against everyone involved in their care. This could potentially include the manufacturer who developed and marketed the AI, but might also include the doctor who input data into the AI or interpreted data coming out of the AI.”

Scary stuff, and the problems don’t stop there. In a malpractice suit, clinicians are asked to explain the steps they took, what informed their decision-making and what they might have done to mitigate risk – not so easy if the “clinician” in this instance is an algorithm.

As Curtin and Tomlin say:

“Even if it is possible to know what data were input into the AI and what the AI’s final output data were, the exact steps taken by the algorithm to reach the output decision may not always be fully retraced. You cannot always ask AI to explain its output in the same way that you can ask a doctor. In some circumstances, there can be an ability to retrace the parameters, but it can be a challenge to determine the basis for the alleged error or ambiguity over outcome.”

So what are we to do?

The first thing that anybody using AI as part of a diagnostic or treatment regime must do is ensure they comply with any guidelines a regulatory authority has issued, including certification where it exists.

Secondly, the doctor must have the final say in exercising their medical judgement. The Royal Free’s approach of evaluating AI as part of the screening process seems sensible: AI is there to support rather than supplant the doctor, and presumably, where clinician and AI conflict, the clinician’s view will prevail.

Thirdly, data security is paramount. If a hacked digital health product injures a patient, product liability may hinge on whether the medical product manufacturer or the software designer could reasonably have designed a system resistant to cyberattack, and on the extent to which such a vulnerability was reasonably foreseeable, given the general public’s awareness of cybersecurity issues.

There is no doubt that the courts will be challenged by these questions in the years ahead, but it is equally certain that AI will be among the diagnostic and interventional tools clinicians employ in years to come.
