CMStatistics 2022
B1540
Title: Bayesian inference using synthetic likelihood: Asymptotics and adjustments
Authors: Robert Kohn - University of New South Wales (Australia) [presenting]
David Frazier - Monash University (Australia)
Christopher Drovandi - Queensland University of Technology (Australia)
David Nott - National University of Singapore (Singapore)
Abstract: Implementing Bayesian inference is often computationally challenging in complex models, especially when calculating the likelihood is difficult. Synthetic likelihood is one approach for carrying out inference when the likelihood is intractable but it is straightforward to simulate from the model. The method constructs an approximate likelihood by modelling a vector summary statistic as multivariate normal, with the unknown mean and covariance estimated by simulation. Previous research demonstrates that the Bayesian implementation of synthetic likelihood can be more computationally efficient than approximate Bayesian computation, a popular likelihood-free method, in the presence of a high-dimensional summary statistic. Three contributions are made. The first shows that if the summary statistics are well behaved, then the synthetic likelihood posterior is asymptotically normal and yields credible sets with the correct level of coverage. The second compares the computational efficiency of Bayesian synthetic likelihood and approximate Bayesian computation, showing that Bayesian synthetic likelihood is the more efficient of the two. Based on the asymptotic results, the third proposes adjusted inference methods when a possibly misspecified form, such as a diagonal or a factor model, is assumed for the covariance matrix of the synthetic likelihood to speed up computation.
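The synthetic likelihood construction described above can be sketched in a few lines. The following is a minimal NumPy illustration, not the authors' implementation: the names `synthetic_loglik`, `simulate_summary`, and the toy model are hypothetical, and the Gaussian density is evaluated directly from the simulated mean and covariance.

```python
import numpy as np

def synthetic_loglik(theta, s_obs, simulate_summary, m=200, rng=None):
    """Estimate the synthetic log-likelihood at parameter theta.

    The vector summary statistic is modelled as multivariate normal; its
    mean and covariance are estimated from m simulations of the model.
    `simulate_summary(theta, rng)` is a placeholder for the user's
    model-plus-summary-statistic mapping (an assumption of this sketch).
    """
    rng = np.random.default_rng(rng)
    # Simulate m summary-statistic vectors from the model at theta.
    sims = np.array([simulate_summary(theta, rng) for _ in range(m)])
    mu = sims.mean(axis=0)              # estimated mean
    Sigma = np.cov(sims, rowvar=False)  # estimated (full) covariance
    # Evaluate the multivariate normal log-density of the observed summary.
    d = s_obs.size
    diff = s_obs - mu
    _, logdet = np.linalg.slogdet(Sigma)
    quad = diff @ np.linalg.solve(Sigma, diff)
    return -0.5 * (d * np.log(2.0 * np.pi) + logdet + quad)

# Toy usage: a model whose 2-d summary statistic is N(theta, I).
def toy_summary(theta, rng):
    return theta + rng.standard_normal(theta.size)

ll = synthetic_loglik(np.zeros(2), np.zeros(2), toy_summary, m=500, rng=1)
```

This estimate would typically be plugged into an MCMC sampler to target the synthetic likelihood posterior. The abstract's third contribution replaces the full covariance `Sigma` with a cheaper, possibly misspecified form (e.g. its diagonal, or a factor model) and then adjusts the resulting inference.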