CMStatistics 2021
Title: Bayesian Wasserstein deconvolution
Authors: Judith Rousseau - University of Oxford (United Kingdom)
Catia Scricciolo - University of Verona (Italy) [presenting]
Abstract: The focus is on the problem of recovering a distribution function from independent replicates that are additively contaminated with random errors; the error distribution is known, but the observer can record only the noisy observations. We investigate whether a Bayesian nonparametric approach for modelling the latent distribution may yield inferences with frequentist asymptotic validity under the 1-Wasserstein metric. When the error density is ordinary smooth, we develop an inversion inequality relating the $L^1$-distance between mixtures to the 1-Wasserstein distance between the corresponding mixing distributions. This inequality improves on existing ones when no assumption, beyond moment constraints, is imposed on the mixing distribution, and it yields information-theoretic optimal rates. In fact, minimax-optimal posterior contraction rates for the mixed densities yield optimal rates for the corresponding mixing distributions. An application of this inversion inequality to the deconvolution problem shows that, when the mixing distribution is absolutely continuous with respect to Lebesgue measure, a careful choice of the prior law, acting as an efficient approximation scheme for the sampling density, leads to a posterior contraction rate equal, up to a logarithmic factor, to the lower-bound estimation rate. The same prior law is shown to also adapt to the regularity level of a mixing density belonging to a Sobolev space, thus leading to a new adaptive estimation method with respect to the Wasserstein loss.
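A minimal numerical sketch of the setting described above (not the authors' method): observations follow the additive contamination model $Y = X + \varepsilon$, where the error density is ordinary smooth (here Laplace, chosen only for illustration), and the loss on distributions is the 1-Wasserstein distance. The example shows the W1 gap one incurs by naively treating the contaminated sample as an estimate of the latent law, which is the error a deconvolution procedure aims to reduce. The mixture components and noise scale below are arbitrary illustrative choices.

```python
# Illustrative sketch of Wasserstein deconvolution's setting; all
# distributional choices below are assumptions for the example only.
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
n = 5000

# Latent draws X from a smooth mixing distribution (a two-component
# Gaussian mixture, chosen only for illustration).
comp = rng.integers(0, 2, size=n)
x = np.where(comp == 0,
             rng.normal(-1.0, 0.5, size=n),
             rng.normal(1.5, 0.7, size=n))

# Observed draws are additively contaminated: Y = X + eps, with eps
# Laplace-distributed, an example of an ordinary smooth error density.
eps = rng.laplace(0.0, 0.4, size=n)
y = x + eps

# Empirical 1-Wasserstein distance between the latent and the observed
# samples: the price of ignoring the noise, bounded by E|eps| = 0.4 here.
w1 = wasserstein_distance(x, y)
print(f"W1(latent, contaminated) = {w1:.3f}")
```

In one dimension, `scipy.stats.wasserstein_distance` computes W1 as the integrated absolute difference between the empirical quantile functions, which matches the 1-Wasserstein metric used in the abstract.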