
Cazzaniga, S., De Ponti, L., Baratelli, G., Francione, S., La Vecchia, C., Di Landro, A., et al. (2022). Agreement on classification of clinical photographs of pigmentary lesions: Exercise after a training course with young dermatologists. DERMATOLOGY REPORTS [10.4081/dr.2022.9500].

Agreement on classification of clinical photographs of pigmentary lesions: Exercise after a training course with young dermatologists

Carugno, Andrea
2022

Abstract

Smartphone apps may help promote the early diagnosis of melanoma. The reliability of specialists’ judgment on lesions should be assessed. Here, we evaluated the agreement of six young dermatologists after specific training. Clinical judgment was evaluated during two online sessions, one month apart, on a series of 45 pigmentary lesions. Lesions were classified as highly suspicious, suspicious, non-suspicious, or not assessable. Cohen’s and Fleiss’ kappa were used to calculate intra- and inter-rater agreement. The overall intra-rater agreement was 0.42 (95% confidence interval, CI: 0.33-0.50), ranging from 0.12 to 0.59 across individual raters. The inter-rater agreement during the first phase was 0.29 (95% CI: 0.24-0.34). When considering the agreement for each category of judgment, kappa varied from 0.19 for not assessable to 0.48 for highly suspicious lesions. Similar results were obtained in the second exercise. The study showed a less than satisfactory agreement among young dermatologists. Our data point to the need to improve the reliability of clinical diagnoses of melanoma at the population level, especially when assessing small lesions and when dealing with thin melanomas.
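The intra-rater agreement reported in the abstract is based on Cohen’s kappa, which compares the observed agreement between two sets of ratings of the same lesions against the agreement expected by chance. A minimal sketch of the statistic (an illustration, not the authors’ actual analysis code; category labels are hypothetical shorthand):

```python
from collections import Counter

def cohen_kappa(ratings_a, ratings_b):
    """Cohen's kappa between two ratings of the same items.

    ratings_a, ratings_b: equal-length sequences of category labels,
    e.g. one rater's judgments in session 1 and session 2.
    """
    if len(ratings_a) != len(ratings_b):
        raise ValueError("both sessions must rate the same lesions")
    n = len(ratings_a)
    # Observed proportion of identical judgments.
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Chance agreement from the marginal frequency of each category.
    freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a.keys() | freq_b.keys()) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical labels: "h" = highly suspicious, "s" = suspicious,
# "n" = non-suspicious, "x" = not assessable.
session1 = ["h", "s", "s", "n", "x", "n"]
session2 = ["h", "s", "n", "n", "x", "n"]
print(round(cohen_kappa(session1, session2), 2))
```

Fleiss’ kappa, used for the inter-rater analysis, generalizes the same observed-versus-chance comparison to more than two raters.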
Yes
Journal article - Scientific article
Scientific
teledermatology, skin cancer, melanoma, classification, agreement
English
Cazzaniga, S; De Ponti, L; Baratelli, G; Francione, S; La Vecchia, C; Di Landro, A; Carugno, A; Di Mercurio, M; Germi, L; Trevisan, G; Fenaroli, M; Capasso, C; Pezza, M; Dri, P; Castelli, E; Naldi, L
Files in this product:
No files are associated with this product.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: http://hdl.handle.net/10281/390752
Citations
  • Scopus ND
  • ISI ND