
Conjugate priors and bias reduction for logistic regression models

Rigon, T. (first author)
2023

Abstract

We address the issue of divergent maximum likelihood estimates for logistic regression models by considering a conjugate prior penalty which always produces finite estimates. We show that the proposed method is closely related to the reduced-bias approach of Firth (1993), and that the induced penalized likelihood can be expressed as a genuine binomial likelihood, replacing the original data with pseudo-counts.
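The key point of the abstract, that the conjugate-prior penalized likelihood is itself a genuine binomial likelihood based on pseudo-counts, can be illustrated with a short numerical sketch. The code below is not the authors' implementation; it assumes, purely for illustration, a simple pseudo-count scheme in which each Bernoulli observation y_i contributes y_i + c/2 pseudo-successes out of 1 + c pseudo-trials for some prior weight c, and maximizes the resulting binomial log-likelihood with SciPy. The constant c and this specific augmentation are assumptions made for the example, not the paper's exact construction.

import numpy as np
from scipy.optimize import minimize

def penalized_logistic_fit(X, y, c=1.0):
    """Maximize a pseudo-count-penalized binomial log-likelihood.

    Illustrative assumption: each observation contributes y_i + c/2
    pseudo-successes out of 1 + c pseudo-trials, so the penalized
    likelihood is a genuine binomial likelihood whose maximizer is
    always finite, even under separation.
    """
    y_star = y + c / 2.0          # pseudo-successes
    m_star = 1.0 + c              # pseudo-trials per observation

    def neg_loglik(beta):
        eta = X @ beta
        # binomial log-likelihood: y*_i eta_i - m*_i log(1 + exp(eta_i))
        return -np.sum(y_star * eta - m_star * np.logaddexp(0.0, eta))

    beta0 = np.zeros(X.shape[1])
    res = minimize(neg_loglik, beta0, method="BFGS")
    return res.x

# Example with complete separation: ordinary maximum likelihood
# estimates diverge, while the pseudo-count estimates stay finite.
X = np.column_stack([np.ones(6), np.array([-2.0, -1.0, -0.5, 0.5, 1.0, 2.0])])
y = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])
print(penalized_logistic_fit(X, y, c=1.0))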
Journal article - Scientific article
Bias reduction; Boundary estimate; Conjugate prior; Exponential family; Pseudo-counts
English
11 July 2023
2023, Volume 202 (November 2023), Article no. 109901
Rigon, T., & Aliverti, E. (2023). Conjugate priors and bias reduction for logistic regression models. Statistics & Probability Letters, 202, 109901. https://doi.org/10.1016/j.spl.2023.109901
Files in this record:
There are no files associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10281/453703