Title: V-fold cross-validation improved for nonparametric regression
Authors: Amandine Dubois - CREST-ENSAI (France) [presenting]
Adrien Saumard - CREST-ENSAI (France)
Abstract: The framework in which the properties of model selection procedures are best understood theoretically is that of estimating a function, such as a regression function or a density, by minimizing the risk over finite-dimensional models, corresponding to expansions on functional bases. In this case, the hyper-parameter to be tuned is the dimension of the models under consideration. The methods commonly used are based on the unbiased risk estimation principle. However, the validity of this principle is essentially asymptotic. In the least squares regression framework, a modification of the V-fold penalty will be proposed which overcomes the limitations of the unbiased risk estimation principle. In a nutshell, it is more efficient to estimate a quantile of the risk of the estimators rather than its mean. An experimental study will highlight the performance of this procedure in comparison with classical V-fold cross-validation.
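To make the contrast concrete, the sketch below implements classical V-fold cross-validation for choosing the model dimension in least squares regression (here on a hypothetical piecewise-constant basis), and compares selecting the dimension by the mean of the V hold-out risks with selecting it by an upper quantile of them. The quantile level (0.7) and the basis are illustrative assumptions, not the authors' exact penalty.

```python
import numpy as np

rng = np.random.default_rng(0)

def design(x, dim):
    # Piecewise-constant (regular histogram) basis of given dimension on [0, 1]
    idx = np.minimum((x * dim).astype(int), dim - 1)
    B = np.zeros((len(x), dim))
    B[np.arange(len(x)), idx] = 1.0
    return B

def lstsq_fit(x, y, dim):
    # Least squares projection of y onto the dim-dimensional model
    B = design(x, dim)
    coef, *_ = np.linalg.lstsq(B, y, rcond=None)
    return lambda xt: design(xt, dim) @ coef

def vfold_scores(x, y, dims, V=5):
    # For each candidate dimension, collect the V hold-out empirical risks
    folds = np.array_split(rng.permutation(len(x)), V)
    scores = np.empty((len(dims), V))
    for j, d in enumerate(dims):
        for v, test in enumerate(folds):
            train = np.setdiff1d(np.arange(len(x)), test)
            f = lstsq_fit(x[train], y[train], d)
            scores[j, v] = np.mean((y[test] - f(x[test])) ** 2)
    return scores

# Simulated regression data: y = sin(2*pi*x) + noise
n = 400
x = rng.uniform(0, 1, n)
y = np.sin(2 * np.pi * x) + 0.3 * rng.standard_normal(n)

dims = np.arange(1, 31)
S = vfold_scores(x, y, dims, V=5)

# Classical V-fold CV: minimize the mean of the fold risks
d_mean = dims[np.argmin(S.mean(axis=1))]
# Quantile variant (illustrative level): minimize an upper quantile instead
d_quant = dims[np.argmin(np.quantile(S, 0.7, axis=1))]
print("mean-selected dim:", d_mean, "| quantile-selected dim:", d_quant)
```

Minimizing an upper quantile penalizes dimensions whose hold-out risk is unstable across folds, which is one way to read the abstract's point that estimating a quantile of the risk can be preferable to estimating its mean.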