CMStatistics 2022
B0318
Title: Black box variational inference with a deterministic objective: Faster, more accurate, and even more black box
Authors: Tamara Broderick - MIT (United States) [presenting]
Abstract: Automatic differentiation variational inference (ADVI) offers fast and easy-to-use posterior approximation in multiple modern probabilistic programming languages. However, its stochastic optimizer lacks clear convergence criteria and requires parameter tuning. Moreover, ADVI inherits the poor uncertainty estimates of mean-field variational Bayes (MFVB). We introduce "deterministic ADVI" (D-ADVI) to solve these issues. In particular, we replace the intractable MFVB objective with a Monte Carlo approximation; by fixing the Monte Carlo draws up front, the objective becomes deterministic, which allows the use of off-the-shelf deterministic optimization tools. We show that D-ADVI reliably finds good solutions with default settings (unlike ADVI) and is faster and more accurate than ADVI. Moreover, unlike ADVI, D-ADVI is amenable to linear response corrections, yielding more accurate posterior covariance estimates. We demonstrate the benefits of D-ADVI on a variety of real-world problems.
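As a rough illustration of the fixed-draw idea described in the abstract, the sketch below fits a mean-field Gaussian approximation to a toy two-dimensional posterior by freezing a set of standard normal draws and handing the resulting deterministic objective to an off-the-shelf optimizer. The toy log posterior, the number of draws, and the choice of L-BFGS-B are illustrative assumptions, not the authors' implementation.

```python
import jax
import jax.numpy as jnp
import numpy as np
from scipy.optimize import minimize

def log_post(theta):
    # Toy unnormalized log posterior (an assumption for illustration):
    # a zero-mean Gaussian with a correlated precision matrix.
    prec = jnp.array([[2.0, 0.9], [0.9, 1.5]])
    return -0.5 * theta @ prec @ theta

D = 2   # latent dimension
S = 30  # number of Monte Carlo draws, generated once and then frozen
z = jnp.asarray(np.random.default_rng(0).standard_normal((S, D)))

def neg_elbo(params):
    # Mean-field Gaussian q(theta) with mean mu and log standard deviations.
    mu, log_sig = params[:D], params[D:]
    theta = mu + jnp.exp(log_sig) * z  # reparameterized draws using the frozen z
    # The expected log posterior is approximated with the same draws on every
    # call, so neg_elbo is an ordinary deterministic function of params.
    expected_log_post = jnp.mean(jax.vmap(log_post)(theta))
    entropy = jnp.sum(log_sig) + 0.5 * D * jnp.log(2.0 * jnp.pi * jnp.e)
    return -(expected_log_post + entropy)

grad_neg_elbo = jax.grad(neg_elbo)
x0 = np.zeros(2 * D)  # initial mu = 0, log_sig = 0
res = minimize(
    lambda x: float(neg_elbo(jnp.asarray(x))),
    x0,
    jac=lambda x: np.asarray(grad_neg_elbo(jnp.asarray(x)), dtype=np.float64),
    method="L-BFGS-B",  # off-the-shelf deterministic optimizer
)
print("variational mean:", res.x[:D])
print("variational std: ", np.exp(res.x[D:]))
```

Because the draws z are fixed, repeated evaluations of neg_elbo at the same parameters return identical values, so the optimizer's standard convergence checks apply directly, in contrast to a stochastic gradient scheme that re-samples draws at every step.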