Title: Online statistical inference for stochastic optimization
Authors: Xi Chen - New York University (United States)
Zehua Lai - University of Chicago (United States)
He Li - New York University (United States)
Yichen Zhang - Purdue University (United States) [presenting]
Abstract: As stochastic optimization attracts attention for a wide range of applications with complex objective functions, there is an increasing demand for uncertainty quantification of estimated parameters. We investigate the problem of statistical inference for model parameters based on gradient-free stochastic optimization methods that use only function values rather than gradients. We first present central limit theorem results for Polyak-Ruppert-averaging type gradient-free estimators. The asymptotic distribution reflects the trade-off between the rate of convergence and function query complexity. To construct valid confidence intervals based on the obtained asymptotic distribution, we further provide a general gradient-free framework for online covariance estimation and analyze the role of function query complexity in the convergence rate of the covariance estimator. This yields a one-pass, computationally efficient procedure that simultaneously produces an estimator of the model parameters and conducts statistical inference. Finally, we present numerical experiments that verify our theoretical results and illustrate extensions of our method to several applications.
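The estimation half of the pipeline described above can be sketched in a few lines. The following is a minimal illustration, not the authors' exact procedure: it runs a standard two-point (function-value-only) gradient estimator inside SGD on a synthetic linear least-squares problem and maintains the Polyak-Ruppert average in one pass. The step-size schedule, smoothing radius `mu`, and problem setup are illustrative assumptions; the paper's online covariance estimator for confidence intervals is not reproduced here.

```python
import numpy as np

# Hedged sketch (assumed setup, not the paper's exact algorithm):
# two-point zeroth-order SGD with Polyak-Ruppert averaging on a
# synthetic linear least-squares model y = x . theta* + noise.

rng = np.random.default_rng(0)
d = 3
theta_star = np.array([1.0, -2.0, 0.5])  # hypothetical ground truth

def loss(theta, x, y):
    # Per-sample squared loss: f(theta; x, y) = 0.5 * (x . theta - y)^2.
    return 0.5 * (x @ theta - y) ** 2

def zo_gradient(theta, x, y, mu=1e-4):
    # Two-point gradient estimate using only function values:
    # (f(theta + mu*u) - f(theta - mu*u)) / (2*mu) * u, with u Gaussian,
    # so the estimate is unbiased for grad f as mu -> 0 (E[u u^T] = I).
    u = rng.standard_normal(d)
    return (loss(theta + mu * u, x, y) - loss(theta - mu * u, x, y)) / (2 * mu) * u

n = 20000
theta = np.zeros(d)      # current SGD iterate
theta_bar = np.zeros(d)  # running Polyak-Ruppert average
for t in range(1, n + 1):
    x = rng.standard_normal(d)
    y = x @ theta_star + 0.1 * rng.standard_normal()
    eta = 0.1 * t ** -0.75                # slowly decaying step size (assumed)
    theta -= eta * zo_gradient(theta, x, y)
    theta_bar += (theta - theta_bar) / t  # one-pass running average

print(theta_bar)  # close to theta_star
```

The averaged iterate `theta_bar` is the quantity whose central limit theorem the talk studies; constructing valid confidence intervals around it additionally requires the online covariance estimator developed in the paper.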