Title: Robust machine learning via median-of-means
Authors: Guillaume Lecue - CNRS and ENSAE (France) [presenting]
Matthieu Lerasle - CNRS (France)
Abstract: Median-of-means (MOM) based procedures have recently been introduced in learning theory. These estimators outperform classical least-squares estimators when data are heavy-tailed and/or corrupted. The major drawback of current MOM procedures, however, is that none of them can be computed in practice. We introduce minmax MOM estimators and show that they achieve the same sub-Gaussian deviation bounds as the alternatives, in both small- and high-dimensional least-squares regression. In particular, these estimators are efficient under moment assumptions on data that may have been corrupted by a few outliers. Beyond these theoretical guarantees, the definition of minmax MOM estimators suggests simple and systematic modifications of standard algorithms used to approximate least-squares estimators and their regularized versions. As a proof of concept, we perform an extensive simulation study of these algorithms for robust versions of the lasso.
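The core median-of-means idea underlying these estimators can be sketched in a few lines: split the sample into disjoint blocks, average within each block, and take the median of the block means, so that a few outliers can corrupt only a few blocks. The sketch below is illustrative only (function name, block count, and data are our own choices, not the authors' implementation):

```python
import numpy as np

def median_of_means(x, n_blocks=10, rng=None):
    """Median-of-means estimate of the mean of a 1-D sample.

    The data are shuffled, split into n_blocks disjoint blocks,
    each block is averaged, and the median of the block means is
    returned. A few outliers corrupt at most a few blocks, so the
    median of the block means remains stable.
    """
    rng = np.random.default_rng(rng)
    x = np.asarray(x, dtype=float)
    perm = rng.permutation(len(x))  # random blocks via a shuffle
    blocks = np.array_split(x[perm], n_blocks)
    return float(np.median([b.mean() for b in blocks]))

# Heavy-tailed sample (Student t with 2 degrees of freedom, so
# infinite variance) corrupted by a few huge outliers: the plain
# empirical mean is dragged far from 0, the MOM estimate is not.
rng = np.random.default_rng(0)
sample = np.concatenate([rng.standard_t(df=2, size=1000), [1e4] * 5])
print(np.mean(sample))                        # far from the true mean 0
print(median_of_means(sample, n_blocks=20, rng=1))  # close to 0
```

The minmax MOM estimators of the talk replace this scalar median step by a minmax formulation over MOM-estimated empirical increments, which is what makes the resulting procedures amenable to standard iterative algorithms.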