Title: Scalable Bayesian $p$-generalized probit and logistic regression via coresets
Authors: Zeyu Ding - TU Dortmund (Germany) [presenting]
Abstract: The logit and probit link functions are arguably the two most common choices for binary regression models. Many studies have extended the choice of link functions to avoid possible misspecification and improve the fit of the model to the data. We introduce the $p$-generalized normal distribution into binary regression in a Bayesian framework. The $p$-generalized normal distribution has received considerable attention due to its flexibility in modeling the tails, while generalizing, for instance, over the standard normal distribution, recovered for $p=2$, and the Laplace distribution, recovered for $p=1$. A scalable maximum likelihood estimation (MLE) method for $p$-generalized probit regression has been developed recently. We extend the estimation from MLE to Bayesian posterior estimates using Markov chain Monte Carlo (MCMC) sampling for the model parameter $\beta$ and the link function parameter $p$. We use simulated and real-world data to examine the effect of different values of $p$ on the estimation results and to show how logistic and probit regression can be incorporated into this broader framework. To make our Bayesian methods scalable to large data, we also incorporate coresets as a means of reducing the data before performing the complex and time-consuming MCMC analysis. This allows us to perform very efficient calculations while retaining the original posterior parameter distributions up to small distortions in practice, and with theoretical guarantees.
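As a rough illustration (not the authors' implementation), the $p$-generalized probit link can be sketched as the CDF of a density proportional to $\exp(-|x|^p/p)$, which reduces to the standard normal CDF for $p=2$. A minimal Python sketch, assuming this standardization convention and a simple Simpson-rule quadrature:

```python
from math import erf, exp, gamma, log, sqrt

def pgnorm_cdf(x, p, n=2000):
    """CDF of the p-generalized normal with density proportional to
    exp(-|t|^p / p); p = 2 recovers the standard normal, p = 1 a
    scaled Laplace. Evaluated by Simpson's rule on [0, |x|] plus
    symmetry around zero."""
    # Normalizing constant: integral of exp(-|t|^p/p) over the reals
    # equals 2 * p^(1/p - 1) * Gamma(1/p).
    c = 2.0 * p ** (1.0 / p - 1.0) * gamma(1.0 / p)
    f = lambda t: exp(-abs(t) ** p / p) / c
    b = abs(x)
    if b == 0.0:
        return 0.5
    h = b / n
    s = f(0.0) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(i * h)
    integral = s * h / 3.0
    return 0.5 + integral if x > 0 else 0.5 - integral

def log_likelihood(beta, X, y, p):
    """Binary-regression log-likelihood under the p-generalized
    probit link (illustrative helper, not from the paper)."""
    ll = 0.0
    for xi, yi in zip(X, y):
        eta = sum(b * v for b, v in zip(beta, xi))
        q = pgnorm_cdf(eta, p)
        ll += log(q) if yi == 1 else log(1.0 - q)
    return ll

# Sanity check: p = 2 matches the standard normal CDF via erf.
assert abs(pgnorm_cdf(1.0, 2) - 0.5 * (1 + erf(1 / sqrt(2)))) < 1e-8
```

In a Bayesian treatment, this log-likelihood would enter an MCMC sampler jointly over $\beta$ and $p$, with $p=1$ and $p=2$ covering the Laplace and probit special cases and intermediate values interpolating the tail behavior.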
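The coreset step can be illustrated in a simplified, generic form: subsample a small weighted set of rows so that weighted sums approximate full-data sums, then run MCMC on the small set. The sketch below uses a crude placeholder score (uniform term plus row norm) where the paper's construction would use sensitivity scores tailored to the $p$-generalized model; all names here are hypothetical.

```python
import numpy as np

def importance_coreset(X, m, seed=0):
    """Toy sensitivity-style coreset: draw m rows with probability
    proportional to a crude score (a uniform term plus the row norm,
    standing in for true sensitivities), and attach inverse-probability
    weights 1/(m * q_i) so weighted sums are unbiased estimates of
    full-data sums."""
    n = X.shape[0]
    norms = np.linalg.norm(X, axis=1)
    scores = 1.0 / n + norms / norms.sum()   # placeholder sensitivities
    q = scores / scores.sum()
    rng = np.random.default_rng(seed)
    idx = rng.choice(n, size=m, replace=True, p=q)
    w = 1.0 / (m * q[idx])
    return idx, w

rng = np.random.default_rng(1)
X = rng.normal(size=(10000, 3))
idx, w = importance_coreset(X, 200)
# The weighted row count w.sum() estimates n = 10000 in expectation.
```

The weights would multiply the per-observation log-likelihood terms during MCMC, so the sampler sees only the coreset rather than the full data set.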