EcoSta 2022 - Submission A0494
Title: Gibbs posterior distributions: Construction, concentration, and calibration
Authors: Ryan Martin - North Carolina State University (United States) [presenting]
Abstract: Bayesian inference has certain advantages, but a fully (and, generally, correctly) specified statistical model is needed to realize them. What if the quantity of interest is not naturally described as a ``model parameter''? Then there is no sense in which a specified statistical model could be ``correct'' and, hence, there is a risk of model misspecification bias. To avoid this bias, one can construct a so-called Gibbs posterior that directly targets the quantity of interest, in contrast to a Bayesian posterior, which does so only indirectly through a (possibly misspecified) statistical model and marginalization. First, we will discuss the Gibbs posterior construction; second, we will present asymptotic concentration properties of Gibbs posteriors, with a focus on specific examples; and, finally, we will discuss the need to properly calibrate the Gibbs posterior so that the resulting inferences (or predictions) are valid, and present an algorithm that achieves this.
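
The construction referred to in the abstract can be made explicit; the display below is a summary in standard loss-based notation of my own choosing, not a formula quoted from the talk. If the quantity of interest is the minimizer of a risk,

    \theta^* = \arg\min_\theta R(\theta), \quad R(\theta) = E\{\ell_\theta(Z)\}, \quad R_n(\theta) = n^{-1} \sum_{i=1}^n \ell_\theta(Z_i),

then, given a prior \Pi and a learning rate \omega > 0, the Gibbs posterior replaces the likelihood with the empirical risk:

    \Pi_n(d\theta) \propto \exp\{-\omega n R_n(\theta)\} \Pi(d\theta).

No statistical model for the data is specified, so there is no model to misspecify; the only modeling choice is the loss \ell_\theta that defines the quantity of interest.

One well-known calibration strategy, which may or may not coincide with the algorithm presented in the talk, is to tune the learning rate \omega by bootstrap so that credible sets attain their nominal frequentist coverage. The following is a minimal, self-contained sketch of that idea for the Gibbs posterior of a population median; all names, settings, and numerical choices are illustrative assumptions, not taken from the talk.

    import numpy as np

    rng = np.random.default_rng(0)

    def gibbs_posterior(data, omega, grid):
        # Gibbs posterior for the median on a grid, flat prior:
        # density proportional to exp(-omega * n * R_n(theta)),
        # where R_n(theta) is the average absolute deviation.
        n = len(data)
        risk = np.mean(np.abs(data[:, None] - grid[None, :]), axis=0)
        logpost = -omega * n * risk
        logpost -= logpost.max()                    # avoid underflow
        dens = np.exp(logpost)
        return dens / (dens.sum() * (grid[1] - grid[0]))  # normalize on the grid

    def credible_interval(grid, dens, alpha=0.05):
        # Equal-tailed credible interval computed from the grid density.
        cdf = np.cumsum(dens) * (grid[1] - grid[0])
        lo = grid[np.searchsorted(cdf, alpha / 2)]
        hi = grid[min(np.searchsorted(cdf, 1 - alpha / 2), len(grid) - 1)]
        return lo, hi

    def bootstrap_coverage(data, omega, grid, alpha=0.05, B=200):
        # Estimate coverage: resample the data, recompute the credible interval,
        # and check how often it contains the point estimate from the full data.
        theta_hat = np.median(data)
        hits = 0
        for _ in range(B):
            boot = rng.choice(data, size=len(data), replace=True)
            lo, hi = credible_interval(grid, gibbs_posterior(boot, omega, grid), alpha)
            hits += (lo <= theta_hat <= hi)
        return hits / B

    # Crude calibration over a grid of learning rates: keep the largest omega whose
    # estimated coverage still reaches the nominal 95% level (larger omega gives
    # tighter, but potentially under-covering, credible intervals).
    data = rng.standard_normal(100) + 1.0
    grid = np.linspace(-1.0, 3.0, 400)
    for omega in [2.0, 1.0, 0.5, 0.25]:
        cov = bootstrap_coverage(data, omega, grid)
        print(f"omega={omega:.2f}  estimated coverage={cov:.2f}")

A grid search over \omega is used here only to keep the sketch short; a stochastic-approximation update of \omega toward the nominal coverage level would serve the same purpose.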