Title: Sparse linear mixed model selection via streamlined variational Bayes
Authors: Emanuele Degani - University of Padua (Italy) [presenting]
Luca Maestrini - The Australian National University (Australia)
Dorota Toczydlowska - UCL (United Kingdom)
Matt P Wand - University of Technology Sydney (Australia)
Abstract: Variational approximations facilitate fast approximate inference for the parameters of a variety of statistical models. However, for mixed models with a large number of random effects, naive application of standard variational inference principles does not yield fast approximate inference algorithms, owing to the size of the model design matrices and the inefficient treatment of sparse matrix problems arising in the required approximating density parameter updates. We illustrate how previous streamlined variational inference procedures can be generalized to perform fast and accurate inference for the parameters of linear mixed models with nested random effects and priors for fixed effects selection. The variational inference algorithms converge to the same optima as their standard implementations, but with significantly lower computational effort, memory usage, and time, especially for large numbers of random effects. Using simulated and real data examples, we assess the quality of automated procedures for fixed effects selection that rely only upon variational approximations and are free from hyperparameter tuning, and also show high accuracy of the approximations against Markov chain Monte Carlo.
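The computational gain described above comes from exploiting structure rather than treating the full design matrices as dense. As a rough illustration of the idea (not the authors' actual algorithm), the sketch below contrasts a naive dense solve of a block-diagonal system, of the kind that arises when group-level random effects are independent across groups, with a block-wise solve; the dimensions and matrices are invented for demonstration only:

```python
import numpy as np

rng = np.random.default_rng(0)
m, q = 50, 2  # illustrative: m groups, q random effects per group

# One symmetric positive-definite q x q precision block per group,
# as arises for independent group-level random effects.
blocks = []
for _ in range(m):
    A = rng.standard_normal((q, q))
    blocks.append(A @ A.T + q * np.eye(q))

rhs = rng.standard_normal(m * q)

# Naive approach: assemble the full (mq) x (mq) matrix and solve it
# densely, at O((mq)^3) cost.
full = np.zeros((m * q, m * q))
for i, B in enumerate(blocks):
    full[i * q:(i + 1) * q, i * q:(i + 1) * q] = B
x_dense = np.linalg.solve(full, rhs)

# Structured approach: solve each q x q block separately, at O(m q^3)
# cost, which is the flavor of saving streamlined updates deliver.
x_block = np.concatenate(
    [np.linalg.solve(B, rhs[i * q:(i + 1) * q]) for i, B in enumerate(blocks)]
)

print(np.allclose(x_dense, x_block))  # both routes give the same solution
```

The same solution is recovered either way; only the cost differs, which mirrors the abstract's point that streamlined and standard implementations reach the same optima at very different computational expense.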