Ecological filters used by radiologists to screen clinical information. Credit: The Lancet Digital Health (2024). DOI: 10.1016/S2589-7500(24)00095-5
The use of artificial intelligence in medical diagnostics is on the rise, but new research from the University of Adelaide finds that significant hurdles remain before these systems can match the decision-making of clinicians.
In a paper published in The Lancet Digital Health, Australian Institute for Machine Learning PhD student Lana Tikhomirov, Professor Carolyn Semmler and a team from the University of Adelaide drew on external research to explore what they call the “AI chasm.”
The AI chasm has arisen because the development and commercialization of AI decision-making systems have outpaced understanding of their value to clinicians and their impact on human decision-making.
“This can lead to automation bias (not noticing AI errors) and misuse,” Tikhomirov said. “Misconceptions about AI also limit our ability to get the most out of this new technology and augment humans appropriately.”
“While technology deployment in other high-risk environments, such as the increased automation of airplane cockpits, has been explored to understand and improve its use, evaluating AI deployment for clinicians remains a neglected area. AI should be used more like a clinical drug than a device.”
Research has shown that clinicians are situationally motivated and mentally resourceful decision makers, whereas AI models make decisions without understanding the context of a case or the relationship between the data and the patient.
“The clinical environment is rich with sensory cues that may not be noticeable to a novice observer but are used to make a diagnosis,” Tikhomirov said.
“For example, the brightness of a nodule on a mammogram may indicate the presence of a particular type of tumor, and certain symptoms noted on an imaging requisition can affect the sensitivity with which a radiologist can spot features.
“With experience, clinicians learn what cues direct their attention to the most clinically relevant information in their environment.
“The ability to use this domain-relevant information is known as cue exploitation. It is a hallmark of expertise that enables clinicians to rapidly extract important features from clinical sites while maintaining high accuracy, guiding their subsequent processing and analysis of specific clinical features.
“AI models cannot question datasets in the same way that clinicians are encouraged to question the validity of what they've been taught, a practice known as epistemic humility.”
Further information: Lana Tikhomirov et al., “Medical artificial intelligence for clinicians: the missing cognitive perspective,” The Lancet Digital Health (2024). DOI: 10.1016/S2589-7500(24)00095-5
Provided by University of Adelaide
Source: Bridging the chasm between AI technology and clinicians (August 29, 2024) Retrieved August 29, 2024 from https://medicalxpress.com/news/2024-08-bridging-chasm-ai-technology-clinicians.html