B0401
Title: Tuning-free Stein variational gradient descent
Authors: Christopher Nemeth - Lancaster University (United Kingdom) [presenting]
Abstract: Stein variational gradient descent (SVGD) has become a popular inference technique in statistics and machine learning for sampling from intractable distributions. Using a kernelised version of Stein's method, SVGD provides a deterministic sampling algorithm that iteratively transports a set of particles to progressively approximate a given distribution, usually a Bayesian posterior distribution. This is achieved through gradient-based updates constructed to optimally decrease the Kullback-Leibler divergence within a function space (a reproducing kernel Hilbert space). As with many gradient-based algorithms, the efficiency of SVGD is tied to the choice of step-size parameter. We will introduce a tuning-free version of SVGD based on parameter-free convex optimisation and show that this new algorithm is competitive with vanilla SVGD and enjoys many of the same theoretical properties.
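
To make the particle update concrete, the following is a minimal NumPy sketch of *vanilla* SVGD with an RBF kernel and median-heuristic bandwidth; it is not the authors' tuning-free method. The step size eps passed to svgd_update is exactly the tuning parameter that the proposed algorithm removes. The 2-d Gaussian target and all names here are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

def grad_log_p(x, mu=np.array([1.0, -1.0])):
    # Score of an illustrative N(mu, I) target: grad log p(x) = -(x - mu)
    return -(x - mu)

def svgd_update(x, grad_logp, eps):
    n = x.shape[0]
    diffs = x[:, None, :] - x[None, :, :]      # (n, n, d) pairwise differences
    sq_dists = np.sum(diffs ** 2, axis=-1)     # (n, n) squared distances
    h = np.median(sq_dists) / np.log(n + 1)    # median-heuristic bandwidth
    K = np.exp(-sq_dists / h)                  # RBF kernel matrix
    # Optimal direction: phi(x_i) = (1/n) sum_j [ k(x_j, x_i) grad log p(x_j)
    #                                             + grad_{x_j} k(x_j, x_i) ]
    attraction = K @ grad_logp                 # pulls particles towards high density
    repulsion = (2.0 / h) * (x * K.sum(axis=1, keepdims=True) - K @ x)  # keeps particles spread out
    return x + eps * (attraction + repulsion) / n

x = rng.normal(size=(100, 2))                  # initial particle cloud
for _ in range(500):
    x = svgd_update(x, grad_log_p(x), eps=0.1) # eps: the step size to be tuned away
print(x.mean(axis=0))                          # should approach the target mean [1, -1]

The deterministic update combines an attraction term, which transports particles along the smoothed score, with a kernel-gradient repulsion term that prevents the particles from collapsing onto the posterior mode.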