Campagner, A., Famiglini, L., Cabitza, F. (2022). Re-calibrating Machine Learning Models Using Confidence Interval Bounds. In 19th International Conference on Modeling Decisions for Artificial Intelligence, MDAI 2022 (pp. 132-142). Springer Science and Business Media Deutschland GmbH [10.1007/978-3-031-13448-7_11].
Re-calibrating Machine Learning Models Using Confidence Interval Bounds
Campagner A.; Famiglini L.; Cabitza F.
2022
Abstract
In this article we propose a novel technique for the re-calibration of Machine Learning (ML) models. The technique is based on the computation of confidence intervals for the probability scores produced by any ML model. Compared to existing and commonly used calibration methods, the proposed approach has two important advantages: first, under weak assumptions it provides theoretical guarantees about calibration; second, it does not require any data beyond the training set used to develop the ML model. We illustrate the effectiveness of the proposed approach on a benchmark dataset for COVID-19 diagnosis, comparing it against commonly used re-calibration techniques.
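The abstract does not detail the authors' procedure, so the following is only a minimal sketch of the general idea of interval-based re-calibration: probability scores are binned on the training set, a binomial confidence interval for the positive rate is computed in each bin, and new scores are mapped to an interval bound. The Wilson interval, the number of bins, the use of the lower bound, and all function names (wilson_interval, fit_interval_recalibrator, recalibrate) are illustrative assumptions, not the method described in the paper.

import numpy as np
from scipy.stats import norm

def wilson_interval(k, n, alpha=0.05):
    """Wilson score confidence interval for a binomial proportion (k successes out of n)."""
    if n == 0:
        return 0.0, 1.0
    z = norm.ppf(1 - alpha / 2)
    p_hat = k / n
    denom = 1 + z**2 / n
    centre = (p_hat + z**2 / (2 * n)) / denom
    half = z * np.sqrt(p_hat * (1 - p_hat) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

def fit_interval_recalibrator(train_scores, train_labels, n_bins=10, alpha=0.05):
    """Estimate a per-bin confidence interval for the empirical positive rate,
    using only the training-set scores and labels (no held-out calibration set)."""
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    idx = np.clip(np.digitize(train_scores, edges) - 1, 0, n_bins - 1)
    bounds = np.array([
        wilson_interval(train_labels[idx == b].sum(), (idx == b).sum(), alpha)
        for b in range(n_bins)
    ])
    return edges, bounds

def recalibrate(scores, edges, bounds):
    """Map raw probability scores to the lower confidence bound of their bin's positive rate."""
    idx = np.clip(np.digitize(scores, edges) - 1, 0, len(bounds) - 1)
    return bounds[idx, 0]

Using the lower bound yields deliberately conservative probability estimates; whether and how the paper's guarantees relate to such a bound is not stated in this record, and standard baselines for comparison would be methods such as Platt scaling or isotonic regression.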