CMStatistics 2020: View Submission
Title: Generalized Bayesian conformal inference
Authors: Federico Ferrari - Duke University (United States) [presenting]
Abstract: A widely touted advantage of Bayesian inference is its ability to provide a broad framework for uncertainty quantification in general settings, including for highly complex data and models. In practice, however, Bayesian predictive intervals often have invalid out-of-sample coverage due to prior and/or model misspecification. Such a lack of 'calibration' of predictive distributions calls into question whether credible regions provide an adequate characterization of predictive uncertainty. A new generalized Bayesian framework is proposed for learning a calibrated posterior distribution that minimizes a discrepancy from an initial posterior, subject to a conformal constraint encouraging validity of prediction intervals. To our knowledge, this is the first procedure in which conformal adjustments to predictive intervals also feed back to impact parameter inferences. We propose a framework for inferring the conformal distribution that is highly flexible, easily implemented, and immediately applicable to a wide range of models. Simulation experiments and real-data applications illustrate substantial gains relative to non-conformal Bayesian inference.
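To make the coverage problem concrete, the sketch below is not the generalized Bayesian procedure of the abstract (which feeds conformal adjustments back into parameter inference); it is only the standard split-conformal baseline, shown on a deliberately misspecified toy model where the assumed noise scale is too small, so nominal Gaussian predictive intervals under-cover while the conformal quantile restores coverage. All names and the data-generating setup here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = 2x + Gaussian noise with true sd = 2.
n = 500
x = rng.uniform(-2.0, 2.0, size=n)
y = 2.0 * x + rng.normal(scale=2.0, size=n)

# Split into a fitting set and a calibration set (split conformal).
x_fit, y_fit = x[:250], y[:250]
x_cal, y_cal = x[250:], y[250:]

# Stand-in for the initial model: least-squares slope, but a deliberately
# misspecified noise scale (the model assumes sd = 1, truth is sd = 2),
# mimicking a poorly calibrated Bayesian predictive distribution.
beta_hat = np.sum(x_fit * y_fit) / np.sum(x_fit**2)
sigma_assumed = 1.0

# Nominal 90% predictive interval half-width under the assumed noise model.
z90 = 1.645
nominal_halfwidth = z90 * sigma_assumed

# Split-conformal adjustment: the (1 - alpha) finite-sample quantile of the
# absolute calibration residuals replaces the model-based half-width.
alpha = 0.10
residuals = np.abs(y_cal - beta_hat * x_cal)
k = int(np.ceil((1 - alpha) * (len(residuals) + 1)))
conformal_halfwidth = np.sort(residuals)[k - 1]

# Empirical out-of-sample coverage on fresh data from the same process.
x_new = rng.uniform(-2.0, 2.0, size=2000)
y_new = 2.0 * x_new + rng.normal(scale=2.0, size=2000)
pred = beta_hat * x_new
cover_nominal = np.mean(np.abs(y_new - pred) <= nominal_halfwidth)
cover_conformal = np.mean(np.abs(y_new - pred) <= conformal_halfwidth)
print(f"nominal coverage:   {cover_nominal:.3f}")
print(f"conformal coverage: {cover_conformal:.3f}")
```

The conformal half-width only rescales the predictive interval; the slope estimate itself is untouched, which is exactly the limitation the abstract's feedback mechanism is meant to address.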