Apple Watches and Fitbits wrongly sending healthy people to doctors could overwhelm NHS, report warns

'Some say AI is going to provide instant relief to many of the pressures healthcare systems are facing, others claim it is little more than snake oil,' says Professor Carrie MacEwen

Alex Matthews-King
Health Correspondent
Monday 28 January 2019 13:16 GMT
Fitbit insists that its product is "not a medical device" (Shutterstock)

Waits to see a GP could become even worse because of the rise of health apps, Fitbits and other devices erroneously telling their users they are ill, medical experts have warned.

A report from the Academy of Medical Royal Colleges (AoMRC) on the impact of artificial intelligence and tech in the health system warned that waves of “worried well” could overwhelm the health service and cause “harm at scale”.

Doctors from the AoMRC said they have already had patients booking appointments because their Apple Watch or Fitbit had said their heart rate was too fast or too slow – when it was in fact perfectly normal.

The report imagines two future scenarios. One is a "utopian" system in which the proliferation of technology reduces health inequalities by ensuring everyone has access to the best care and standards are driven up.

“The dystopian, but also feasible outcome is that health inequalities increase, or the system becomes overwhelmed by ‘the worried well’ who have arrived at their GPs’ surgery or the Emergency Department because they have erroneously been told to attend by their AI enabled Fitbit or smartphone,” the report adds.

“Equally worrying is a world where only the wealthy will be able to access the best AI delivered healthcare as those providers will be the only ones with pockets deep enough to access the best data and develop the best AI.”


New features introduced to Apple devices include a system which calls the ambulance service when it detects a fall and its user is unresponsive.

Medical trials are increasingly showing uses for machine learning systems – trained on thousands of X-rays and scans – in spotting conditions such as lung cancer or stroke and prioritising them for human review.

The NHS is also piloting a symptom checker app, developed by tech company Babylon, which can help diagnose patients and tell them which service to use.

The report warns regulators must catch up with this “game changer” technology and clarify whether manufacturers or doctors would be “required to ‘pick up the pieces’ from AI errors or bad advice” that could lead to a missed diagnosis or death.


“Some say AI is going to provide instant relief to many of the pressures healthcare systems across the world are facing, others claim AI is little more than snake oil and can never replace human delivered care,” said Professor Carrie MacEwen, chair of the AoMRC.

“The key theme that leaps from almost every page of this report is the tension between the tech mantra, ‘move fast and break things’, and the principle enshrined in the Hippocratic Oath, ‘First, do no harm.’

“This apparent dichotomy is one that must be addressed if we are all to truly benefit from AI.”
