Title: Automated sensitivity analysis for Bayesian inference via Markov Chain Monte Carlo
Authors: Liana Jacobi - University of Melbourne (Australia)
Dan Zhu - Monash University (Australia) [presenting]
Abstract: Bayesian inference relies heavily on numerical Markov chain Monte Carlo (MCMC) methods for the estimation of intractable high-dimensional posterior distributions, and these methods require specific inputs, notably prior hyper-parameters and chain starting values. We develop a new, general and efficient numerical approach to address two important robustness concerns of MCMC analysis: sensitivity to prior input assumptions, a major obstacle to wider acceptance of Bayesian inference, and MCMC algorithm performance (convergence), reflected in dependence on chain starting values. Current input robustness analysis relies heavily on restrictive and computationally very costly bumping-type approaches that rerun the algorithm with a small set of different inputs, as well as on convergence and efficiency diagnostics based on the autocorrelation of the draws. We introduce a comprehensive input sensitivity analysis based on first-order derivatives of MCMC output with respect to the hyper-parameters and starting values to analyse prior robustness and algorithm convergence and efficiency. The approach builds on recent developments in sensitivity analysis of high-dimensional numerical integrals for classical simulation methods using automatic numerical differentiation methods. We introduce a range of new robustness measures to enable researchers to routinely undertake a comprehensive sensitivity analysis of their MCMC results. The methods are implemented for a range of Gibbs samplers and illustrated using both simulated and real data examples.
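To illustrate the general idea described in the abstract, the following is a minimal sketch (not the authors' implementation) of differentiating Gibbs-sampler output with respect to a prior hyper-parameter via automatic differentiation, here using JAX. The model (a normal likelihood with a Normal prior on the mean and an Inverse-Gamma prior on the variance), the hyper-parameter names (mu0, tau0, a0, b0), and the chain length G are illustrative assumptions. Because each draw is a smooth function of the hyper-parameters given fixed underlying random numbers, the derivative of an MCMC estimate with respect to a prior input can be obtained by differentiating through the whole chain, avoiding costly bumping-style reruns.

```python
# Hedged sketch: first-order sensitivity of Gibbs output w.r.t. a prior
# hyper-parameter via automatic differentiation (illustrative model only).
import jax
import jax.numpy as jnp

y = jnp.array([1.2, 0.8, 1.5, 0.9, 1.1])  # toy data
n = y.shape[0]

def gibbs_posterior_mean(mu0, key, G=200, tau0=1.0, a0=2.0, b0=2.0):
    """Two-block Gibbs sampler for (mu, sigma2) in a normal model with
    mu ~ Normal(mu0, tau0^2) and sigma2 ~ InvGamma(a0, b0).
    Returns the MCMC estimate of E[mu | y].  With a fixed PRNG key the
    draws are differentiable functions of mu0, so jax.grad can
    differentiate through the entire chain (common random numbers)."""
    mu, sigma2 = 0.0, 1.0
    total = 0.0
    for _ in range(G):
        key, k1, k2 = jax.random.split(key, 3)
        # Block 1: mu | sigma2, y ~ Normal(m, v)
        v = 1.0 / (1.0 / tau0**2 + n / sigma2)
        m = v * (mu0 / tau0**2 + jnp.sum(y) / sigma2)
        mu = m + jnp.sqrt(v) * jax.random.normal(k1)
        # Block 2: sigma2 | mu, y ~ InvGamma(a0 + n/2, b0 + SSR/2);
        # jax.random.gamma draws are differentiable, so the inverse-gamma
        # draw rate / Gamma(shape) propagates gradients through sigma2.
        shape = a0 + n / 2.0
        rate = b0 + 0.5 * jnp.sum((y - mu) ** 2)
        sigma2 = rate / jax.random.gamma(k2, shape)
        total = total + mu
    return total / G

key = jax.random.PRNGKey(0)
est = gibbs_posterior_mean(1.0, key)              # posterior-mean estimate
sens = jax.grad(gibbs_posterior_mean)(1.0, key)   # d E[mu | y] / d mu0
```

The derivative `sens` is exactly the kind of first-order hyper-parameter sensitivity the abstract proposes to compute and summarise; an analogous `jax.grad` with respect to the chain's starting values would target the convergence diagnostics.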