Bissiri, P. G., Walker, S. (2012). Converting information into probability measures with the Kullback–Leibler divergence. Annals of the Institute of Statistical Mathematics, 64, 1139–1160. doi:10.1007/s10463-012-0350-4.
Converting information into probability measures with the Kullback-Leibler divergence
Bissiri, Pier Giovanni; Walker, S.
2012
Abstract
This paper uses a decision-theoretic approach to updating a probability measure that represents beliefs about an unknown parameter. A cumulative loss function is considered, which is the sum of two terms: one depends on the prior belief and the other on further information obtained about the parameter. Such information is thereby converted into a probability measure, and the key to this process is shown to be the Kullback–Leibler divergence. The Bayesian approach can be derived as a natural special case. Some illustrations are presented.
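The abstract's setup — a cumulative loss combining a prior-dependent term with an information-dependent term, minimized by a KL-based update with Bayes as a special case — can be illustrated numerically. The sketch below is not the paper's derivation; it assumes the standard discrete-grid formulation in which the loss is L(ν) = E_ν[−log f(x|θ)] + KL(ν‖prior), whose minimizer is the Bayes posterior ∝ prior × likelihood. The Gaussian likelihood, uniform prior, and "tempered" comparison distributions are assumed examples, not taken from the paper.

```python
import numpy as np

# Discrete parameter grid with a uniform prior (assumed example setup).
theta = np.linspace(-3.0, 3.0, 301)
prior = np.full_like(theta, 1.0 / len(theta))

# One observation x under an assumed Gaussian likelihood f(x | theta) = N(theta, 1).
x = 1.2
loglik = -0.5 * (x - theta) ** 2 - 0.5 * np.log(2 * np.pi)

def cumulative_loss(nu):
    """Sum of the information term E_nu[-log f(x|theta)]
    and the prior term KL(nu || prior), evaluated on the grid."""
    kl = np.sum(nu * np.log(nu / prior))
    return -np.sum(nu * loglik) + kl

# The Bayes posterior: prior times likelihood, normalized.
posterior = prior * np.exp(loglik)
posterior /= posterior.sum()

# Compare against tempered alternatives nu_beta ∝ prior * likelihood**beta;
# beta = 1 recovers the Bayes update and should attain the smallest loss.
losses = {}
for beta in [0.5, 1.0, 2.0]:
    nu = prior * np.exp(beta * loglik)
    nu /= nu.sum()
    losses[beta] = cumulative_loss(nu)

assert losses[1.0] == min(losses.values())
```

Run on any grid and data point, the beta = 1 (Bayesian) update minimizes the cumulative loss among these candidates, consistent with the abstract's claim that the Bayesian approach emerges as a natural special case of the KL-based updating rule.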