GMM guided automated level set algorithm for PET image segmentation

DE BERNARDI, ELISABETTA
Second author
2015

Abstract

In PET imaging, the accuracy of lesion border identification plays a key role in patient diagnosis, staging, treatment planning and, ultimately, outcome. Ideally, segmentation algorithms should be as accurate and automatic as possible, avoiding user misinterpretation and user-dependent results. Level Set (LS) algorithms are active contour models in which a defined function evolves under the influence of internal and external forces that shrink or dilate it. The segmentation result usually depends strictly on the parameter values that weight these forces and balance their strength. For this reason, we developed an LS algorithm based on the Mumford-Shah functional in which the weighting parameters are iteratively updated using information from a Gaussian Mixture Model (GMM) based algorithm, leading to a fully automated method. The proposed algorithm (LSGMM) was tested on simulated data divided into three groups: homogeneous lesions, heterogeneous lesions, and lesions with low contrast to the background or with hotter spots nearby. Results were evaluated by computing errors in the estimated volumes and the Dice similarity index. When applied to homogeneous and heterogeneous lesions, LSGMM delineated object boundaries with the same accuracy as a manually tuned LS, with mean volume errors limited to 2% and Dice indices higher than 0.85. These results demonstrate the potential of the proposed algorithm; however, further evaluations should be performed to test its robustness.
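
As background for the abstract above (a standard textbook formulation, not taken from the paper itself), the piecewise-constant Mumford-Shah energy minimized by Chan-Vese-type level set methods can be written as

E(\phi, c_1, c_2) = \mu \, \mathrm{Length}(\{\phi = 0\})
  + \lambda_1 \int_{\Omega} |I(\mathbf{x}) - c_1|^2 \, H(\phi)\, d\mathbf{x}
  + \lambda_2 \int_{\Omega} |I(\mathbf{x}) - c_2|^2 \, \bigl(1 - H(\phi)\bigr)\, d\mathbf{x}

where I is the image, \phi the level set function, H the Heaviside function, c_1 and c_2 the mean intensities inside and outside the contour, and \mu, \lambda_1, \lambda_2 the weighting parameters referred to in the abstract. In LSGMM these weights are updated iteratively from a GMM-based classification of the voxel intensities; the exact functional and update rule used by the authors may differ from this generic form.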
Type: poster + paper
Keywords: GMM; Level Set; PET segmentation; Biomedical Engineering; Bioengineering
Language: English
Conference: IUPESM World Congress on Medical Physics and Biomedical Engineering, 2015
Conference year: 2015
Proceedings: IFMBE Proceedings - World Congress on Medical Physics and Biomedical Engineering, 2015
ISBN: 9783319193878
Publication year: 2015
Volume: 51
Pages: 368-371
Series URL: http://www.springer.com/series/7403
Soffientini, C., DE BERNARDI, E., Baselli, G., El Naqa, I. (2015). GMM guided automated level set algorithm for PET image segmentation. In IFMBE Proceedings - World Congress on Medical Physics and Biomedical Engineering, 2015 (pp.368-371). Springer Verlag [10.1007/978-3-319-19387-8_88].
Files in this product:
There are no files associated with this product.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10281/132930
Citations
  • Scopus: 1
  • Web of Science (ISI): 1