CMStatistics 2018
B0156
Title: Bayesian nonparametric updating of parametric models with Monte Carlo sampling
Authors: Chris Holmes - University of Oxford (United Kingdom) [presenting]
Abstract: Bayesian nonparametric learning of parametric models through the use of suitably randomized objective functions is discussed. Bayesian nonparametric credible regions, analogous to bootstrap confidence intervals, on parameters of objective functions such as likelihoods can exhibit better properties than their parametric counterparts, particularly when the models are wrong. Inference is achieved via parallelizable, independent Monte Carlo posterior sampling of parameters, avoiding MCMC and associated issues such as burn-in and chain dependence, and is highly scalable on modern computer architectures. We demonstrate the approach on a number of examples, including nonparametric inference for Bayesian logistic regression, variational Bayes (VB), and Bayesian random forests.
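
A minimal sketch of the randomized-objective idea described above, for the Bayesian logistic regression example: each posterior draw re-weights the log-likelihood with random (Dirichlet) weights and maximizes it, so draws are independent and embarrassingly parallel, with no MCMC. This is an illustrative assumption-laden sketch, not the speaker's code; names such as npl_logistic_sample and the choice of Dirichlet re-weighting and BFGS optimization are ours.

import numpy as np
from scipy.optimize import minimize

def neg_weighted_loglik(beta, X, y, w):
    """Negative randomly weighted log-likelihood of a logistic regression."""
    z = X @ beta
    # log p(y_i | x_i, beta) for y_i in {0, 1}, written stably via logaddexp
    loglik = y * z - np.logaddexp(0.0, z)
    return -np.sum(w * loglik)

def npl_logistic_sample(X, y, n_draws=1000, seed=0):
    """Independent Monte Carlo draws from a nonparametric (Bayesian bootstrap
    style) posterior over logistic regression parameters."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    draws = np.empty((n_draws, d))
    beta0 = np.zeros(d)
    for b in range(n_draws):                  # each draw is independent, so this
        w = rng.dirichlet(np.ones(n)) * n     # loop can be parallelized freely
        res = minimize(neg_weighted_loglik, beta0, args=(X, y, w), method="BFGS")
        draws[b] = res.x
    return draws

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 3))
    true_beta = np.array([1.0, -2.0, 0.5])
    y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ true_beta)))
    post = npl_logistic_sample(X, y, n_draws=200)
    print("posterior means:", post.mean(axis=0))
    print("95% credible interval for beta[0]:",
          np.percentile(post[:, 0], [2.5, 97.5]))

The empirical quantiles of the draws give the nonparametric credible regions analogous to bootstrap confidence intervals mentioned in the abstract; because every draw solves its own optimization, the scheme has no burn-in and no chain dependence.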