CMStatistics 2020
B0295
Title: On lower bounds for the bias-variance trade-off
Authors: Alexis Derumigny - University of Twente (Netherlands) [presenting]
Johannes Schmidt-Hieber - University of Twente (Netherlands)
Abstract: It is a common phenomenon that, for high-dimensional and nonparametric statistical models, rate-optimal estimators balance squared bias and variance. Although this balancing is widely observed, little is known about whether methods exist that could avoid the trade-off between bias and variance. We propose a general strategy to obtain lower bounds on the variance of any estimator whose bias is smaller than a prespecified bound. This shows to what extent the bias-variance trade-off is unavoidable and allows us to quantify the loss of performance for methods that do not obey it. The approach is based on abstract lower bounds for the variance involving the change of expectation under different probability measures, as well as information measures such as the Kullback-Leibler or chi-square divergence. In the second part of the article, the abstract lower bounds are applied to several statistical models, including the Gaussian white noise model, a boundary estimation problem, the Gaussian sequence model, and the high-dimensional linear regression model. For the trade-off between integrated squared bias and integrated variance in the Gaussian white noise model, we propose to combine the general strategy for lower bounds with a reduction technique. This allows us to reduce the original problem to a lower bound on the bias-variance trade-off for estimators with additional symmetry properties in a simpler statistical model.
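To illustrate the flavor of such variance bounds, here is the classical Hammersley-Chapman-Robbins inequality, given as a point of reference only; it is a special case of the type of bound described above, not the paper's more general statement. For an estimator \hat{T} of a functional \theta(P) and any two probability measures P_0, P_1 with finite chi-square divergence,

\mathrm{Var}_{P_0}(\hat{T}) \;\geq\; \frac{\big( E_{P_1}[\hat{T}] - E_{P_0}[\hat{T}] \big)^2}{\chi^2(P_1 \,\|\, P_0)}.

If the bias of \hat{T} is at most \delta under both measures, the numerator is at least (|\theta(P_1) - \theta(P_0)| - 2\delta)^2 whenever that quantity is nonnegative, so constraining the bias to be small forces the variance to be large.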