CMStatistics 2021
Title: Deep adaptive design: Amortizing sequential Bayesian experimental design
Authors: Adam Foster - University of Oxford (United Kingdom) [presenting]
Abstract: The conventional approach to sequential Bayesian experimental design is to fit a posterior and optimise a design criterion at each iteration. This is computationally costly, and prevents us from using sequential design in many real-world applications, such as online surveys, where we must choose each design in under a second. We will discuss Deep Adaptive Design (DAD), a new method for sequential Bayesian experimental design that neither fits posterior distributions nor optimises the criterion at each iteration of the experiment. Instead, DAD learns a design policy network that takes as input the designs and outcomes from previous iterations, and outputs the next design using a single forward pass. DAD can therefore compute the next design adaptively in milliseconds during a live experiment. The network is trained on millions of simulated experimental trajectories using a contrastive information bound as the training objective. We demonstrate that DAD learns excellent experimental design policies for a number of models, and can even outperform the conventional step-by-step approach whilst being orders of magnitude faster at deployment time.
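To make the policy-network idea concrete, here is a minimal NumPy sketch of a DAD-style design policy: each past (design, outcome) pair is encoded, the per-step encodings are sum-pooled into a history representation, and an output head maps that representation to the next design in one forward pass. All names, dimensions, and the sum-pooling architecture are illustrative assumptions, not the authors' exact implementation; the weights are random stand-ins for trained parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (assumptions, not from the abstract).
D_DESIGN, D_OUT, D_HID = 2, 1, 16

# Random weights stand in for parameters trained on simulated trajectories.
W_enc = rng.normal(size=(D_DESIGN + D_OUT, D_HID))
W_head = rng.normal(size=(D_HID, D_DESIGN))

def next_design(history):
    """history: list of (design, outcome) arrays; returns the next design.

    Encodes each (design, outcome) pair, sum-pools over past steps
    (so the history representation is permutation-invariant), then
    applies a linear output head -- a single cheap forward pass.
    """
    if history:
        pairs = np.stack([np.concatenate([d, y]) for d, y in history])
        pooled = np.tanh(pairs @ W_enc).sum(axis=0)
    else:
        pooled = np.zeros(D_HID)  # empty history -> a fixed first design
    return pooled @ W_head

# A three-step simulated experiment: each design is one forward pass.
history = []
for t in range(3):
    xi = next_design(history)          # choose design adaptively
    y = rng.normal(size=D_OUT)         # stand-in for the real outcome
    history.append((xi, y))
```

Because choosing a design here is just two small matrix products, deployment-time cost is milliseconds; the expensive work (training on simulated trajectories against a contrastive information bound) happens entirely before the live experiment.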