CMStatistics 2017
Title: Convergence rates for stochastic inverse problems using variational methods
Authors: Benjamin Sprung - University of Göttingen (Germany) [presenting]
Thorsten Hohage - University of Göttingen (Germany)
Abstract: Linear inverse problems with Gaussian white noise are considered. In a Hilbert space setting, optimal rates of convergence are well known in the literature for Tikhonov regularization and other regularization methods that can be described by a filtered singular value decomposition. Moreover, optimal rates of convergence, also in non-Hilbert norms, have been shown for methods based on wavelet shrinkage. An advantage of variational methods is that they do not require any spectral knowledge of the forward operator (such as an SVD or a wavelet-vaguelette decomposition), and they naturally generalize to nonlinear forward operators, Banach space settings, and nonquadratic penalty and data-fidelity terms. For a finitely smoothing operator, optimal rates of convergence are demonstrated for Besov penalty terms. Extensions to Poisson data with the Kullback-Leibler divergence as data-fidelity term are discussed.
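As a minimal numerical sketch of the baseline the abstract compares against, the following illustrates Tikhonov regularization as a filtered singular value decomposition for a discretized linear inverse problem with Gaussian white noise. The forward operator, singular value decay, noise level, and regularization parameter below are all illustrative assumptions, not taken from the submission; the point is only that the spectral filter s/(s² + α) caps the 1/s noise amplification of naive inversion.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ill-posed forward operator A = U diag(s) V^T with
# polynomially decaying singular values, a stand-in for a finitely
# smoothing operator (all choices here are illustrative assumptions).
n = 50
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
s = 1.0 / np.arange(1, n + 1) ** 2          # singular values ~ k^{-2}
A = U @ np.diag(s) @ V.T

# Smooth ground truth and noisy data y = A x + sigma * white noise.
x_true = V @ (1.0 / np.arange(1, n + 1) ** 1.5)
sigma = 1e-4
y = A @ x_true + sigma * rng.standard_normal(n)

# Tikhonov regularization as a filtered SVD:
#   x_alpha = sum_k  s_k / (s_k^2 + alpha) * <u_k, y> * v_k
alpha = 1e-4                                 # illustrative parameter choice
x_alpha = V @ ((s / (s**2 + alpha)) * (U.T @ y))

# Naive inversion multiplies each noise coefficient by 1/s_k,
# which blows up for small singular values; the filter prevents this.
x_naive = V @ ((U.T @ y) / s)

err_reg = np.linalg.norm(x_alpha - x_true)
err_naive = np.linalg.norm(x_naive - x_true)
```

In this toy setup the regularized reconstruction error is far below that of the unfiltered inverse; variational methods reach comparable rates without ever forming the SVD, which is the advantage the abstract emphasizes.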