Anichini, G., Natali, C., Cabitza, F. (2024). Invisible to Machines: Designing AI that Supports Vision Work in Radiology. COMPUTER SUPPORTED COOPERATIVE WORK [10.1007/s10606-024-09491-0].

Invisible to Machines: Designing AI that Supports Vision Work in Radiology

Natali, C. (second author); Cabitza, F. (last author)
2024

Abstract

In this article we analyze the clinical use of two deep learning-based automatic detection tools in the field of radiology. The value of these technologies, conceived to assist physicians in reading imaging data (such as X-rays), is generally assessed through human-machine performance comparisons, which do not take into account the complexity of radiologists' interpretive process in its social, tacit, and emotional dimensions. In this radiological vision work, data that inform the physician about the context surrounding a visible anomaly are essential to determining its pathological nature. Likewise, experiential data resulting from the contextual tacit knowledge that regulates professional conduct allow an anomaly to be assessed according to the radiologist's, and the patient's, experience. These data, which remain excluded from artificial intelligence processing, call into question the gap between the norms incorporated by the machine and those leveraged in the daily work of radiologists. The possibility that automated detection may modify the incorporation or exercise of tacit knowledge raises questions about the impact of AI technologies on medical work. This article aims to highlight how the standards that emerge from radiologists' observation practices challenge the automation of their vision work, but also under what conditions AI technologies are considered "objective" and trustworthy by professionals.
Journal article - Scientific article
Artificial intelligence; Decision making; Decision support; Radiological work;
English
28 May 2024
2024
none
Files in this product:
There are no files associated with this product.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10281/487660
Citations
  • Scopus 0
  • Web of Science 0