Title: Prior robustness and convergence analysis for MCMC output based on automated sensitivity computations
Authors: Liana Jacobi - University of Melbourne (Australia) [presenting]
Dan Zhu - Monash University (Australia)
Abstract: Bayesian inference relies heavily on numerical Markov chain Monte Carlo (MCMC) methods to estimate typically intractable high-dimensional posterior distributions, and these methods require specific inputs. We introduce a new, general, and efficient numerical approach to address important robustness concerns of MCMC analysis with respect to prior input assumptions, a major obstacle to wider acceptance of Bayesian inference, as well as MCMC algorithm performance (convergence) as reflected in the dependence of the chain on its starting values. The approach builds on recent developments in the sensitivity analysis of high-dimensional numerical integrals for classical simulation methods, using automatic differentiation to compute first-order derivatives of algorithmic output with respect to all inputs. We introduce a range of new robustness measures based on Jacobian matrices of MCMC output with respect to the two sets of input parameters, prior parameters and chain starting values, to enable researchers to routinely undertake a comprehensive sensitivity analysis of their MCMC results. The methods are implemented for a range of Gibbs samplers and illustrated with both simulated and real-data examples. We show how to address the discontinuities that arise in common random-variable updates in Gibbs algorithms, namely the Gamma and Wishart updates.
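The idea of differentiating MCMC output with respect to prior hyperparameters and chain starting values can be sketched in code. The following is a minimal illustrative example, not the authors' implementation: a two-block Gibbs sampler for the conjugate normal model (unknown mean and precision), written with fixed common random numbers so the whole chain is a deterministic function of its inputs, then differentiated with forward-mode automatic differentiation in JAX. The model, hyperparameter names (`mu0`, `k0`, `a0`, `b0`), and function names are assumptions made for the sketch.

```python
# Hedged sketch of AD-based MCMC sensitivity analysis (not the authors' code).
# Assumed model: y_i ~ N(mu, 1/tau), mu ~ N(mu0, 1/k0), tau ~ Gamma(a0, b0).
# Fixing all PRNG keys (common random numbers) makes the Gibbs chain a
# deterministic map from (mu0, tau_start) to its output, so jax.jacfwd applies.
import jax
import jax.numpy as jnp


def gibbs_posterior_mean(mu0, tau_start, y, k0=1.0, a0=2.0, b0=2.0,
                         n_iter=500, burn=100, seed=0):
    n = y.shape[0]
    keys = jax.random.split(jax.random.PRNGKey(seed), n_iter)
    zs = jax.random.normal(jax.random.PRNGKey(seed + 1), (n_iter,))

    def step(tau, inputs):
        key, z = inputs
        # mu | tau, y: conjugate normal update, reparameterized via fixed z
        prec = k0 + n * tau
        mu = (k0 * mu0 + tau * jnp.sum(y)) / prec + z / jnp.sqrt(prec)
        # tau | mu, y: Gamma update; the shape is constant here, and the
        # parameter dependence enters smoothly through the rate
        shape_post = a0 + 0.5 * n
        rate_post = b0 + 0.5 * jnp.sum((y - mu) ** 2)
        tau_new = jax.random.gamma(key, shape_post) / rate_post
        return tau_new, mu

    _, mus = jax.lax.scan(step, tau_start, (keys, zs))
    # MCMC output: posterior-mean estimate of mu after burn-in
    return jnp.mean(mus[burn:])


# Simulated data with true mean 2 (illustrative)
y = jax.random.normal(jax.random.PRNGKey(42), (50,)) + 2.0
est = gibbs_posterior_mean(0.0, 1.0, y)
# Jacobian of the output w.r.t. (prior mean mu0, starting value tau_start)
sens = jax.jacfwd(gibbs_posterior_mean, argnums=(0, 1))(0.0, 1.0, y)
print(est, sens)
```

In this toy case the sensitivity to `mu0` should be small and positive (roughly the prior shrinkage weight `k0 / (k0 + n*tau)`), while the sensitivity to the starting value decays with chain length, so it can serve as a convergence diagnostic in the spirit of the abstract. Discontinuous updates such as general Gamma or Wishart draws require the additional care the abstract refers to; this sketch sidesteps that by keeping the Gamma shape parameter constant.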