Researchers Caution Radiologists About Over-Reliance on AI in Diagnoses
A recent study highlights the risk that radiologists may become over-reliant on artificial intelligence (AI) tools when making medical diagnoses, particularly when the AI flags specific regions of an X-ray. The research, published in the journal Radiology, was conducted by a group of U.S. researchers who recruited 220 physicians from multiple specialties to interpret chest X-rays with guidance from AI-based recommendations.

The physicians, including radiologists as well as internal and emergency medicine specialists, were asked to interpret the X-rays and could accept or modify the AI's suggestions. The study aimed to assess how different types of AI explanations—local (focused on specific areas of the image) and global (based on comparisons with past cases)—affected diagnostic accuracy.

The results revealed that local AI explanations, when accurate, improved diagnostic accuracy and reduced interpretation time. Physicians using local explanations achieved a diagnostic accuracy rate of 92.8%, compared with 85.3% for those relying on global explanations. When the AI's advice was inaccurate, however, accuracy dropped sharply: to 23.6% with local explanations and a slightly higher 26.1% with global explanations.

Dr. Paul H. Yi, co-author of the study and director of intelligent imaging informatics at St. Jude Children’s Research Hospital, emphasized the importance of designing AI tools thoughtfully. He stressed that while AI has the potential to enhance clinical practice, poorly designed explanations could introduce unintended risks. He also noted an unexpected finding: physicians, both radiologists and non-radiologists, tended to trust local explanations more quickly, even when they were incorrect. This phenomenon, referred to as “automation bias,” reveals a subtle but significant insight into how AI explanations can influence trust and decision-making, often unconsciously.

Dr. Yi suggests that one way to mitigate this bias is through a more structured integration of AI into clinical routines. He pointed out that physicians rely on checklists and patterns developed through years of training to minimize errors. However, the introduction of AI tools could disrupt these routines, and future clinical workflows should adapt to incorporate AI in a way that complements existing practices.

The study urges caution in the use of AI tools in radiology, recommending that AI designs be carefully considered to avoid over-reliance and ensure they enhance rather than undermine diagnostic accuracy.
