Title: Predictive model degrees of freedom in linear regression
Authors: Bo Luan - Google (United States) [presenting]
Yoonkyung Lee - Ohio State University (United States)
Yunzhang Zhu - Ohio State University (United States)
Abstract: Overparametrized interpolating models have drawn increasing attention in machine learning. Some recent studies suggest that regularized interpolating models can generalize well. This phenomenon seemingly contradicts the conventional wisdom that interpolation tends to overfit the data and perform poorly on test data, and it appears to defy the bias-variance trade-off. One shortcoming of the existing theory is that the classical notion of model degrees of freedom fails to explain the intrinsic differences among interpolating models, since it focuses on estimating in-sample prediction error. This motivates an alternative measure of model complexity that can differentiate interpolating models and take different test points into account. In particular, we propose a measure with a proper adjustment based on the squared covariance between the predictions and observations. Our analysis of the least squares method reveals some interesting properties of the measure, which can reconcile the double descent phenomenon with the classical theory. This opens doors to an extended definition of model degrees of freedom in modern predictive settings.
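For context on the classical notion the abstract critiques: in the standard theory, the degrees of freedom of a fitting procedure is the sum of covariances between fitted and observed responses, scaled by the noise variance, and for ordinary least squares it equals the number of parameters p. The sketch below is a minimal Monte Carlo illustration of that classical definition only (it is not the authors' proposed squared-covariance measure); the design, dimensions, and number of replications are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, sigma = 50, 5, 1.0          # assumed sample size, dimension, noise level
X = rng.standard_normal((n, p))   # fixed design matrix
beta = rng.standard_normal(p)
mu = X @ beta                     # true mean vector

# Draw many response vectors with the same design to estimate covariances
R = 20000
Y = mu + sigma * rng.standard_normal((R, n))   # R x n response draws
H = X @ np.linalg.solve(X.T @ X, X.T)          # hat matrix of least squares
Yhat = Y @ H.T                                 # fitted values for each draw

# Classical df = (1/sigma^2) * sum_i Cov(yhat_i, y_i), estimated empirically
cov = ((Yhat - Yhat.mean(axis=0)) * (Y - Y.mean(axis=0))).mean(axis=0)
df_est = cov.sum() / sigma**2
print(round(df_est, 1))   # close to p = 5, i.e., trace of the hat matrix
```

Because this definition depends only on in-sample covariances, it assigns the same complexity to any interpolating fit, which is exactly the limitation that motivates the test-point-aware measure described in the abstract.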