CMStatistics 2022
B1464
Title: On the convergence of coordinate ascent variational inference
Authors: Debdeep Pati - Texas A&M University (United States)
Anirban Bhattacharya - Texas A&M University (United States) [presenting]
Yun Yang - University of Illinois Urbana-Champaign (United States)
Abstract: As a computational alternative to Markov chain Monte Carlo approaches, variational inference (VI) is becoming increasingly popular for approximating intractable posterior distributions in large-scale Bayesian models, owing to its comparable efficacy and superior efficiency. Some recent works provide theoretical justifications for VI by proving its statistical optimality for parameter estimation in various settings; meanwhile, a formal analysis of the algorithmic convergence of VI is still largely lacking. We consider the common coordinate ascent variational inference (CAVI) algorithm for implementing mean-field (MF) VI, which optimizes a Kullback-Leibler divergence objective functional over the space of all factorized distributions. Focusing on the two-block case, we analyze the convergence of CAVI by leveraging the extensive toolbox from functional analysis and optimization. We provide general conditions certifying global or local exponential convergence of CAVI. As illustrations, we apply the developed theory to a number of examples and derive explicit problem-dependent upper bounds on the algorithmic contraction rate.
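As a minimal, hypothetical illustration of the kind of two-block CAVI contraction the abstract discusses (this toy example is my own, not taken from the paper): for a zero-mean bivariate Gaussian target with correlation rho, the mean-field coordinate updates are available in closed form, and the block means contract geometrically at the explicit rate rho^2.

```python
import numpy as np

def cavi_two_block(rho, m1=1.0, m2=1.0, iters=50):
    """Two-block CAVI for mean-field VI of a zero-mean bivariate
    Gaussian target N(0, Sigma) with unit variances and correlation rho.

    With precision matrix Lam = inv(Sigma), the optimal coordinate update
    for block i is Gaussian with variance 1/Lam[i,i] and a mean that is a
    linear function of the other block's current mean. Only the means
    evolve across sweeps, so we track them and their absolute error.
    """
    Sigma = np.array([[1.0, rho], [rho, 1.0]])
    Lam = np.linalg.inv(Sigma)
    errs = []
    for _ in range(iters):
        m1 = -Lam[0, 1] / Lam[0, 0] * m2  # update q_1 given current q_2
        m2 = -Lam[1, 0] / Lam[1, 1] * m1  # update q_2 given updated q_1
        errs.append(abs(m1) + abs(m2))    # distance to the fixed point (0, 0)
    return m1, m2, errs

m1, m2, errs = cavi_two_block(rho=0.5)
# Each full sweep multiplies the error by rho**2 = 0.25, an example of the
# problem-dependent exponential contraction rate the abstract refers to.
```

Here `-Lam[0, 1] / Lam[0, 0]` equals `rho`, so one sweep maps `m2` to `rho**2 * m2`; the per-sweep error ratio observed numerically matches the analytic rate.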