Fouladi, S., Darvizeh, F., Gianini, G., Di Meo, R., Di Palma, L., Damiani, E., et al. (2026). Exploring UNet-based models for prostate lesion segmentation from multi-sequence MRI (T2W, ADC, DWI). World Wide Web, 29(1). https://doi.org/10.1007/s11280-025-01392-6
Exploring UNet-based models for prostate lesion segmentation from multi-sequence MRI (T2W, ADC, DWI)
Gianini, Gabriele Secondo
2026
Abstract
Accurate delineation of lesions in prostate MRI is crucial for the diagnosis of prostate cancer. Manual segmentation is time-consuming, requires advanced medical expertise, and is subject to inter-operator variability. Automatic lesion segmentation therefore represents a valuable tool to support clinicians by reducing workload, minimizing observer bias, and enabling more consistent image analysis. In this work, we investigated the performance of four deep learning architectures for prostate lesion segmentation: nnU-Net, DenseUNet, SegResUNet, and U-Net. Unlike many existing studies that rely on publicly available data, we constructed a dedicated dataset to better capture real-world variability and challenges. The dataset, comprising T2-weighted (T2W), apparent diffusion coefficient (ADC), and diffusion-weighted imaging (DWI) sequences, was carefully annotated by medical experts to ensure high-quality labels. Training was performed using the full combination of these modalities. Two cohorts were considered based on lesion severity, as defined by PI-RADS (Prostate Imaging–Reporting and Data System) scores: one with only PI-RADS 4–5 lesions (151 patients), and another including PI-RADS 3 cases, totaling 209 patients. Evaluation was conducted both on a patient-by-patient basis and in a consolidated all-patient setting. In the patient-level analysis, nnU-Net achieved the highest Dice similarity coefficient (DSC) of 0.60 when trained on PI-RADS 4–5 lesions, while in the all-patient analysis, DenseUNet attained a DSC of 0.57 on the same dataset. These results are within the range reported in recent prostate lesion segmentation studies, and in some cases are comparable to or exceed those obtained with substantially larger datasets.
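The Dice similarity coefficient (DSC) used for evaluation above measures the overlap between a predicted and a reference binary segmentation mask. The following is a generic NumPy-based sketch of the metric, not the authors' implementation; the function name and the smoothing term `eps` are illustrative choices:

```python
import numpy as np

def dice_coefficient(pred, target, eps=1e-7):
    """Dice similarity coefficient between two binary masks:
    DSC = 2 * |pred ∩ target| / (|pred| + |target|)."""
    pred = np.asarray(pred, dtype=bool)
    target = np.asarray(target, dtype=bool)
    intersection = np.logical_and(pred, target).sum()
    # eps avoids division by zero when both masks are empty
    return (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)

# Toy 2x3 masks: 2 overlapping voxels, 3 foreground voxels in each mask
pred = np.array([[1, 1, 0], [0, 1, 0]])
target = np.array([[1, 0, 0], [0, 1, 1]])
print(dice_coefficient(pred, target))  # → 2*2 / (3+3) ≈ 0.667
```

A DSC of 1.0 indicates perfect overlap and 0.0 indicates none; the reported values of 0.60 (nnU-Net, patient-level) and 0.57 (DenseUNet, all-patient) sit between these extremes.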
| File | Size | Format |
|---|---|---|
| Fouladi-2026-World Wide Web-VoR.pdf (open access; Publisher's Version, Version of Record; Creative Commons license) | 1.39 MB | Adobe PDF |
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.


