B1725
Title: Unconventional regularization for efficient machine learning
Authors: Lorenzo Rosasco - Unige MIT IIT (Italy) [presenting]
Abstract: Regularization is classically designed by penalizing, or imposing explicit constraints on, an empirical objective function. This approach can be derived from different perspectives and has optimal statistical guarantees. However, it postpones computational considerations to a separate analysis. In large-scale scenarios, treating statistical and numerical aspects independently often leads to prohibitive computational requirements. It is then natural to ask whether different regularization principles exist, or can be derived, that encompass both statistical and computational aspects at once. Several ideas in this direction are presented, showing how procedures typically developed to perform efficient computations can often be seen as a form of implicit regularization. We will discuss how iterative optimization of an empirical objective leads to regularization, and analyze the effect of acceleration, preconditioning, and stochastic approximations. We will further discuss the regularization effect of sketching/subsampling methods by drawing a connection to classical regularization with projection methods common in applied mathematics. We will show how these forms of implicit regularization can achieve optimal statistical guarantees with dramatically reduced computational costs.
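To make the first idea concrete, here is a minimal sketch (not part of the submission) of iterative regularization: gradient descent on an unpenalized least-squares objective, where the stopping time plays the role of the regularization parameter. The data, step size, and iteration budget below are illustrative assumptions.

```python
# Iterative regularization sketch: gradient descent on an unpenalized
# least-squares objective, with early stopping acting as regularization.
# Roughly, stopping after t steps with step size eta behaves like ridge
# regression with penalty lambda ~ 1/(eta * t). All data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n, d = 100, 50
X = rng.standard_normal((n, d))
w_true = rng.standard_normal(d)
y = X @ w_true + 0.5 * rng.standard_normal(n)

step = 1.0 / np.linalg.norm(X, 2) ** 2   # step size 1/L, L = spectral norm^2
w = np.zeros(d)
for t in range(1, 501):
    w -= step * X.T @ (X @ w - y)        # gradient step on ||Xw - y||^2 / 2
    # stopping here, at iteration t, yields an estimator comparable to
    # ridge regression with penalty ~ 1/(step * t)
```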
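Along the same lines, a hedged sketch of how acceleration interacts with implicit regularization: Nesterov momentum on the same objective drives the bias down much faster (t accelerated steps behave roughly like t^2 plain gradient steps), so it reaches a given regularization level more cheaply but is also quicker to overfit, and the stopping time must be tuned accordingly. The momentum schedule is the standard (t - 1)/(t + 2) rule; everything else carries over from the sketch above.

```python
# Accelerated iterative regularization sketch (Nesterov momentum).
# Reuses X, y, d, step from the gradient-descent sketch above.
w = np.zeros(d)
w_prev = np.zeros(d)
for t in range(1, 201):
    beta = (t - 1) / (t + 2)             # standard Nesterov momentum schedule
    v = w + beta * (w - w_prev)          # extrapolation step
    w_prev = w
    w = v - step * X.T @ (X @ v - y)     # gradient step at extrapolated point
    # acceleration shortens the path: fewer iterations per regularization
    # level, but also a narrower window before overfitting sets in
```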
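For the sketching/subsampling thread, the sketch below illustrates Nyström-style subsampling for kernel ridge regression, in the spirit of the projection-regularization connection the abstract mentions: the solution is restricted to the span of m << n sampled columns of the kernel matrix, so m acts as a second regularization knob that simultaneously controls computational cost. The kernel choice, sampling scheme, and all parameters are illustrative assumptions, not the authors' method.

```python
# Subsampling-as-regularization sketch: Nystrom approximation for kernel
# ridge regression on synthetic 1-d data. Solving in the span of m sampled
# centers is a projection method; m trades statistical accuracy for cost.
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    # squared distances via broadcasting, then Gaussian kernel
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

rng = np.random.default_rng(1)
n, m, lam = 500, 50, 1e-3                   # m Nystrom centers, ridge penalty
X = rng.uniform(-3, 3, size=(n, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(n)

idx = rng.choice(n, size=m, replace=False)  # uniform subsampling of centers
K_nm = gaussian_kernel(X, X[idx])           # n x m cross-kernel
K_mm = gaussian_kernel(X[idx], X[idx])      # m x m kernel on centers

# projected ridge problem: (K_nm^T K_nm + lam * n * K_mm) a = K_nm^T y,
# an m x m system instead of the full n x n one
A = K_nm.T @ K_nm + lam * n * K_mm
a = np.linalg.solve(A, K_nm.T @ y)
f_hat = K_nm @ a                            # fitted values at training points
```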