DeepMIF: Deep Learning Based Cell Profiling for Multispectral Immunofluorescence Images with Graphical User Interface

Gritti G.;
2022

Abstract

Multispectral immunofluorescence (M-IF) analysis is used to investigate the cellular landscape of tissue sections and the spatial interactions of cells. However, the complex makeup of markers in these images hinders accurate quantification of cell phenotypes. We developed DeepMIF, a new deep learning (DL)-based tool with a graphical user interface (GUI) to detect and quantify cell phenotypes on M-IF images and to visualize whole slide images (WSI) and cell phenotypes. To identify cell phenotypes, we detected cells on the deconvoluted images, followed by co-expression analysis to classify cells expressing single or multiple markers. We trained, tested and validated our model on >50,000 expert single-cell annotations from multiple immune panels on 15 samples from follicular lymphoma patients. Our algorithm obtained a cell classification accuracy and area under the curve (AUC) ≥ 0.98 on an independent validation panel. Cell phenotype identification took on average 27.5 min per WSI, and rendering of a WSI took on average 0.07 min. DeepMIF is optimized to run on local computers or high-performance clusters independent of the host platform. These results suggest that DeepMIF is an accurate and efficient tool for the analysis and visualization of M-IF images, enabling the identification of novel prognostic cell phenotypes in tumours.
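The abstract describes a two-step pipeline: cells are first detected on each deconvoluted single-marker image, and the per-marker detections are then combined in a co-expression analysis that classifies cells expressing single or multiple markers. The sketch below illustrates one way such a co-expression step could work; the function name, the dictionary layout of per-marker centroids, the 5 µm matching radius, and the use of scipy's cKDTree are illustrative assumptions, not the authors' implementation.

```python
# Minimal, hypothetical sketch of a co-expression step: per-marker cell
# detections (centroids from deconvoluted channels) are merged when they
# fall within a small radius of one another, and each merged cell is
# labelled with the set of markers it co-expresses. All names and the
# matching radius are assumptions for illustration only.
import numpy as np
from scipy.spatial import cKDTree


def coexpression_phenotypes(detections, radius_um=5.0):
    """detections: dict marker -> (N, 2) array of centroids in microns.
    Returns a list of (centroid, frozenset of markers) phenotype calls."""
    markers = list(detections)
    anchors = np.asarray(detections[markers[0]], dtype=float)
    labels = [{markers[0]} for _ in range(len(anchors))]
    for marker in markers[1:]:
        pts = np.asarray(detections[marker], dtype=float)
        if len(pts) == 0:
            continue
        # Match each detection of this marker to its nearest anchor cell.
        dist, idx = cKDTree(anchors).query(pts, k=1)
        matched = dist <= radius_um
        for i in idx[matched]:
            labels[i].add(marker)              # co-expressed on an existing cell
        for p in np.flatnonzero(~matched):     # unmatched: new single-marker cell
            anchors = np.vstack([anchors, pts[p]])
            labels.append({marker})
    return [(tuple(c), frozenset(m)) for c, m in zip(anchors, labels)]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    cd4 = rng.uniform(0, 100, size=(30, 2))                # synthetic CD4+ centroids
    foxp3 = np.vstack([cd4[:10] + 1.0,                     # 10 cells co-expressing FOXP3
                       rng.uniform(0, 100, size=(5, 2))])  # 5 FOXP3-only detections
    calls = coexpression_phenotypes({"CD4": cd4, "FOXP3": foxp3})
    n_double = sum(len(m) == 2 for _, m in calls)
    print(f"{len(calls)} cells, {n_double} classified as CD4+FOXP3+")
```

In the published tool the per-marker detections come from the trained deep learning cell detector on whole slide images; here they are synthetic points, and a real implementation would also need to handle detection confidence and ambiguous matches.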
Type: paper
Keywords: Cell classification; Cell detection; Deep learning; Image viewer; Multispectral immunofluorescence
Language: English
Conference: 25th International Conference on Medical Image Computing and Computer-Assisted Intervention, MICCAI 2022 - September 18–22, 2022
Conference year: 2022
Published in: Medical Image Computing and Computer Assisted Intervention – MICCAI 2022, 25th International Conference, Singapore, September 18–22, 2022, Proceedings, Part IV
ISBN: 9783031164392
Year of publication: 2022
Volume: 13434
Pages: 140–149
Hagos, Y., Akarca, A., Ramsay, A., Rossi, R., Pomplun, S., Moioli, A., et al. (2022). DeepMIF: Deep Learning Based Cell Profiling for Multispectral Immunofluorescence Images with Graphical User Interface. In Medical Image Computing and Computer Assisted Intervention – MICCAI 2022, 25th International Conference, Singapore, September 18–22, 2022, Proceedings, Part IV (pp. 140–149). Springer Science and Business Media Deutschland GmbH [10.1007/978-3-031-16440-8_14].
Files in this item:
There are no files associated with this item.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10281/529714
Citations
  • Scopus 0
  • Web of Science 0