Title: Scaling Monte Carlo inference for state-space models
Authors: Alexander Shestopaloff - The Alan Turing Institute (United Kingdom) [presenting]
Abstract: The iterated conditional Sequential Monte Carlo (cSMC) method is a particle MCMC method commonly used for state inference in non-linear, non-Gaussian state-space models. Standard implementations of iterated cSMC provide an efficient way to sample state sequences in low-dimensional state-space models. However, efficiently scaling iterated cSMC methods to perform well in models with a high-dimensional state remains a challenge. One reason for this is the use of a global proposal, constructed without reference to the current state sequence. In high dimensions, such a proposal will typically not be well-matched to the posterior and will impede efficient sampling. We will describe a technique for constructing efficient proposals in high dimensions that are local relative to the current state sequence. A second obstacle to the scalability of iterated cSMC is that the proposal does not use the entire observed sequence. We will introduce a principled approach to incorporating all data in the cSMC proposal at time $t$. By considering several examples, we will demonstrate that both strategies improve the performance of iterated cSMC for state sequence sampling in high-dimensional state-space models.
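To make the setting concrete, the following is a minimal sketch of a single cSMC sweep for a hypothetical one-dimensional linear-Gaussian state-space model, using exactly the kind of global (bootstrap) proposal the abstract identifies as problematic in high dimensions: the proposal at each time step ignores both the current reference state sequence and the observations. The model, function name, and all parameters are illustrative assumptions, not the authors' method.

```python
import numpy as np

def csmc_sweep(y, x_ref, N=100, a=0.9, sig_x=1.0, sig_y=0.5, rng=None):
    """One cSMC sweep for the toy model
        x_t = a * x_{t-1} + N(0, sig_x^2),   y_t = x_t + N(0, sig_y^2).
    Particle N-1 is pinned to the reference trajectory x_ref;
    the proposal is the (global) bootstrap transition density,
    which uses neither x_ref nor the observations y.
    Illustrative sketch only."""
    rng = np.random.default_rng() if rng is None else rng
    T = len(y)
    X = np.empty((T, N))              # particle states
    A = np.empty((T, N), dtype=int)   # ancestor indices (A[0] unused)
    # t = 0: sample from the prior, pin the last particle to the reference
    X[0] = rng.normal(0.0, sig_x, N)
    X[0, -1] = x_ref[0]
    logw = -0.5 * ((y[0] - X[0]) / sig_y) ** 2
    for t in range(1, T):
        # resample ancestors for all particles except the reference
        w = np.exp(logw - logw.max())
        w /= w.sum()
        A[t, :-1] = rng.choice(N, size=N - 1, p=w)
        A[t, -1] = N - 1              # the reference path keeps its ancestor
        # global proposal: transition density, no reference to x_ref or y
        X[t] = a * X[t - 1, A[t]] + rng.normal(0.0, sig_x, N)
        X[t, -1] = x_ref[t]
        logw = -0.5 * ((y[t] - X[t]) / sig_y) ** 2
    # draw one trajectory by tracing ancestry back from a final particle
    w = np.exp(logw - logw.max())
    w /= w.sum()
    k = rng.choice(N, p=w)
    out = np.empty(T)
    for t in range(T - 1, -1, -1):
        out[t] = X[t, k]
        if t > 0:
            k = A[t, k]
    return out
```

Iterated cSMC repeatedly calls such a sweep, feeding the returned trajectory back in as the next reference path `x_ref`. The abstract's two proposed improvements would replace the transition-density proposal above with one that is local to `x_ref` and that conditions on all of `y`, not just `y[t]`.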