CMStatistics 2018
Title: Bayesian Wasserstein deconvolution
Authors: Catia Scricciolo - University of Verona (Italy) [presenting]
Abstract: The problem of recovering a distribution function from $n$ i.i.d. observations additively contaminated with random errors whose distribution is known to the observer is considered. Such indirect measurements occur quite often, and one wishes to undo the errors inflicted on the signal. We investigate whether a nonparametric Bayes approach to modelling the latent distribution can yield valid asymptotic frequentist inferences under the 1-Wasserstein loss, which originated in the optimal transport literature and now has applications in a broad spectrum of research areas, such as statistics, economics, and image processing. The approach we adopt uses inversion inequalities, relating distances between mixtures to the 1-Wasserstein distance between the corresponding mixing distributions, to translate posterior contraction rates for mixtures into rates for mixing distributions. For finite mixtures, when the number of components is known up to an upper bound, Bayesian estimation of the mixing distribution can be performed at the best possible rate $n^{-1/4}$ (up to a log-term). If the mixing distribution is completely unknown, for convolutions with ordinary smooth errors, the rate $n^{-1/8}$, up to a log-factor, is obtained for the Laplace error distribution. We discuss whether the prior law can be chosen to act as an efficient approximating scheme leading to the lower-bound rate $n^{-1/5}$ recently obtained in the literature for a minimum distance estimator.
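The setting above can be simulated in a few lines. The sketch below (an illustration, not the authors' Bayesian procedure; the atom locations, weights, and Laplace scale are arbitrary choices) draws latent observations from a two-atom mixing distribution, contaminates them with Laplace errors of known scale — an "ordinary smooth" error distribution — and computes the 1-Wasserstein distance between the contaminated and clean samples, using the fact that in one dimension the distance between two equal-size empirical laws is the mean absolute difference of the sorted samples.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Latent draws from a finite mixing distribution with atoms at -1 and 1
# (hypothetical example values).
latent = rng.choice([-1.0, 1.0], size=n, p=[0.3, 0.7])

# Additive Laplace noise whose distribution (here, scale 0.5) is known
# to the observer, as in the deconvolution problem.
noise = rng.laplace(loc=0.0, scale=0.5, size=n)
obs = latent + noise

# In 1D, the 1-Wasserstein distance between two empirical distributions
# of equal size is the mean absolute difference of the sorted samples.
w1 = np.mean(np.abs(np.sort(obs) - np.sort(latent)))
print(round(w1, 3))
```

The printed distance is at most the mean absolute noise (here $\approx 0.5$), since sorting realizes the optimal coupling in one dimension; recovering the latent law at the stated rates requires undoing this contamination rather than merely measuring it.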