B0924
Title: Semi-implicit variational inference in additive models
Authors: Jens Lichter - University of Goettingen (Germany) [presenting]
Paul Wiemann - The Ohio State University (United States)
Thomas Kneib - University of Goettingen (Germany)
Abstract: For large-scale problems and complex Bayesian regression models, variational inference (VI) offers a computationally efficient way of approximating the posterior distribution when no analytic form exists. Classical VI, however, is often based on the strong mean-field assumption, under which the parameters in the approximation are assumed to be mutually independent. As a consequence, parameter uncertainties are often underestimated, particularly in regression models with strongly correlated covariates. We propose to use the semi-implicit VI (SIVI) approach in additive models as an inferential method that weakens the mean-field assumption and improves uncertainty estimation. First, SIVI uses a hierarchical construction of the parameters to restore parameter dependencies. Second, the mixing distribution on the higher level of the hierarchy does not need to be explicit, meaning that a highly flexible implicit distribution represented by a neural network can be chosen. We present results from a simulation study showing that SIVI accurately estimates parameter uncertainties and can outperform classical VI in additive models. Furthermore, we demonstrate our approach with an application to tree height models on a large-scale forestry data set.
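The hierarchical construction described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; it only shows, under assumed toy settings (a fixed random MLP in place of a trained one, a 2-dimensional parameter vector, and an isotropic Gaussian conditional), how a semi-implicit variational family is sampled: an implicit mixing distribution pushes Gaussian noise through a neural network to produce the mean of an explicit Gaussian conditional, so the induced marginal over the parameters is no longer mean-field.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical weights of a small MLP acting as the implicit mixing
# distribution. In SIVI these weights would be optimized against the
# (lower-bounded) ELBO; here they are fixed random values for illustration.
W1 = rng.normal(size=(8, 4))
b1 = rng.normal(size=8)
W2 = rng.normal(size=(2, 8))
b2 = rng.normal(size=2)

def sample_mixing(n):
    """Implicit mixing: push Gaussian noise eps through an MLP to get z.
    The density of z is never evaluated, only sampled from."""
    eps = rng.normal(size=(n, 4))
    h = np.tanh(eps @ W1.T + b1)
    return h @ W2.T + b2

def sample_theta(n, sigma=0.1):
    """Semi-implicit q(theta): explicit Gaussian conditional
    N(theta | z, sigma^2 I) mixed over the implicit distribution of z."""
    z = sample_mixing(n)
    return z + sigma * rng.normal(size=z.shape)

# Draw parameter samples; their induced marginal is non-Gaussian and can
# carry dependence between components, unlike a mean-field (diagonal)
# Gaussian approximation.
theta = sample_theta(50_000)
print(np.corrcoef(theta.T)[0, 1])
```

Because only samples of the mixing variable are needed, the neural network is unconstrained by any requirement to have a tractable density, which is what makes the mixing distribution "implicit" in this construction.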