Don't rely on Dr AI: New search engine gives medical advice that could lead to death in one in five cases

By Daily Mail | Posted: October 11, 2024

Frantically looking up our symptoms online and self-diagnosing is something many of us are guilty of.

But Dr AI could be providing 'potentially harmful' medication advice, a concerning study has suggested. 

German researchers found that more than a fifth of AI-powered chatbot answers to common prescription drug questions could 'lead to death or severe harm'. 

Experts urged patients not to rely on such search engines to give them accurate and safe information. 

Medics were also warned against recommending the tools until more 'precise and reliable' alternatives are made available. 


In the study, scientists from the University of Erlangen-Nuremberg pinpointed the 10 most frequently asked patient questions for the 50 most prescribed drugs in the US.

These included adverse drug reactions, instructions for use and contraindications — reasons why the medication should not be taken. 

Using Bing Copilot — a search engine with AI-powered chatbot features developed by Microsoft — researchers assessed all 500 responses against answers given by clinical pharmacists and doctors with expertise in pharmacology. 

Responses were also compared against a peer-reviewed, up-to-date drug information website. 

They found chatbot statements didn’t match the reference data in over a quarter (26 per cent) of all cases and were fully inconsistent in just over 3 per cent. 

But further analysis of a subset of 20 answers revealed that four in ten (42 per cent) were judged likely to lead to moderate or mild harm, and 22 per cent to death or severe harm. 

The scientists, who also assessed the readability of all chatbot answers, discovered the responses often required a degree-level education to understand them.

Writing in the journal BMJ Quality & Safety, the researchers said: 'Chatbot answers were largely difficult to read and answers repeatedly lacked information or showed inaccuracies, possibly threatening patient and medication safety.

'Despite their potential, it is still crucial for patients to consult their healthcare professionals, as chatbots may not always generate error-free information. 

'Caution is advised in recommending AI-powered search engines until citation engines with higher accuracy rates are available.'

A Microsoft spokesperson said: 'Copilot answers complex questions by distilling information from multiple sources into a single response.

'Copilot provides linked citations to these answers so the user can further explore and research as they would with traditional search.

'For questions related to medical advice, we always recommend consulting with a healthcare professional.'


The scientists also acknowledged the study had 'several limitations', including the fact it did not draw on real patient experiences.

In reality, they said, patients could ask the chatbot for more information or, for example, prompt it to provide answers in a clearer structure. 

It comes as medics were last month warned they could be risking patient safety by relying on AI to help with diagnoses.

Researchers sent the survey to a thousand GPs using Doctors.net.uk, the largest professional network for UK doctors currently registered with the General Medical Council. 

One in five admitted using programmes such as ChatGPT and Bing AI in clinical practice, despite there being no official guidance on how to work with them.

Experts warned that problems such as 'algorithm biases' could lead to misdiagnoses, and that patient data could be at risk of being compromised. 

They said doctors must be made aware of the risks and called for legislation to cover their use in healthcare settings.
