
Cabitza, F., Campagner, A., Famiglini, L., Gallazzi, E., La Maida, G. (2022). Color Shadows (Part I): Exploratory Usability Evaluation of Activation Maps in Radiological Machine Learning. In Machine Learning and Knowledge Extraction - 6th IFIP TC 5, TC 12, WG 8.4, WG 8.9, WG 12.9 International Cross-Domain Conference, CD-MAKE 2022, Vienna, Austria, August 23–26, 2022, Proceedings (pp.31-50). Springer Science and Business Media Deutschland GmbH [10.1007/978-3-031-14463-9_3].

Color Shadows (Part I): Exploratory Usability Evaluation of Activation Maps in Radiological Machine Learning

Cabitza, F.; Campagner, A.; Famiglini, L.; Gallazzi, E.; La Maida, G.
2022

Abstract

Although deep learning-based AI systems for diagnostic imaging tasks have shown virtually superhuman accuracy, their use in medical settings has been questioned due to their "black-box", non-interpretable nature. To address this shortcoming, several methods have been proposed to make AI eXplainable (XAI), including Pixel Attribution Methods (PAMs); however, it is still unclear whether these methods are actually effective in "opening" the black box and improving diagnosis, particularly in tasks where pathological conditions are difficult to detect. In this study, we focus on the detection of thoraco-lumbar fractures from X-rays, with the goal of assessing the impact of PAMs on diagnostic decision making by addressing two separate research questions: first, whether activation maps (AMs, an instance of PAMs) were perceived as useful in the aforementioned task; and, second, whether the maps were also capable of reducing the diagnostic error rate. We show that, even though AMs were not considered significantly useful by physicians, the image readers found high value in the maps along other perceptual dimensions (i.e., pertinency, coherence) and, most importantly, that their accuracy significantly improved when given XAI support in a pilot study involving 7 doctors in the interpretation of a small, but carefully chosen, set of images.
Keywords: Activation maps; eXplainable AI; Medical machine learning; Thoracolumbar fractures; X-rays
Language: English
Conference: 6th IFIP TC 5, TC 12, WG 8.4, WG 8.9, WG 12.9 International Cross-Domain Conference for Machine Learning and Knowledge Extraction, CD-MAKE 2022, held in conjunction with the 17th International Conference on Availability, Reliability and Security, ARES 2022, 23–26 August 2022
ISBN: 9783031144622
Cabitza, F; Campagner, A; Famiglini, L; Gallazzi, E; La Maida, G
Files for this record:
No files are associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this record: https://hdl.handle.net/10281/394394
Citations
  • Scopus 0