Title: High-dimensional regression with L0 regularization
Authors: Antoine Dedieu - MIT (United States)
Rahul Mazumder - MIT (United States)
Peter Radchenko - University of Sydney (Australia) [presenting]
Abstract: New applications of discrete optimization techniques in high-dimensional regression will be discussed. In particular, we will review the recently proposed mixed integer optimization implementations of the best subset selection estimator and the discrete Dantzig Selector. The latter estimator minimizes the number of nonzero regression coefficients subject to a budget on the maximal absolute correlation between the features and the residuals. It can be expressed as the solution to a mixed integer linear optimization problem, a computationally tractable framework that delivers provably optimal global solutions. Advances in integer optimization algorithms make the estimator highly scalable: it handles problems with 10,000 predictors gracefully. We will also discuss applications of mixed integer optimization to high-dimensional linear regression with group structure, as well as to high-dimensional additive modeling. In addition, we will consider a regularized version of the best subset selector and investigate its advantages in low-signal regimes.
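The mixed integer linear formulation described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it uses SciPy's HiGHS-based `milp` interface, a standard big-M linking of coefficients to binary support indicators, and an assumed bound `M` on the coefficient magnitudes. The variable vector stacks the continuous coefficients `beta` with binary indicators `z`; the objective minimizes the support size `sum(z)` subject to the correlation budget `||X^T(y - X beta)||_inf <= delta`.

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds


def discrete_dantzig_selector(X, y, delta, M=10.0):
    """Sketch of the discrete Dantzig Selector as a MILP (big-M formulation).

    Minimizes the number of nonzero coefficients subject to
    ||X^T (y - X beta)||_inf <= delta. M is an assumed upper bound
    on |beta_j| and must be chosen large enough to be valid.
    """
    n, p = X.shape
    G = X.T @ X  # Gram matrix
    b = X.T @ y  # feature-response correlations
    # Decision vector: [beta (p continuous), z (p binary)].
    # Objective: minimize sum of z, i.e. the number of nonzeros.
    c = np.concatenate([np.zeros(p), np.ones(p)])
    # Correlation budget: b - delta <= G @ beta <= b + delta.
    corr = LinearConstraint(np.hstack([G, np.zeros((p, p))]),
                            b - delta, b + delta)
    # Big-M linking: beta_j <= M z_j and -beta_j <= M z_j.
    I = np.eye(p)
    link_up = LinearConstraint(np.hstack([I, -M * I]), -np.inf, 0.0)
    link_lo = LinearConstraint(np.hstack([-I, -M * I]), -np.inf, 0.0)
    integrality = np.concatenate([np.zeros(p), np.ones(p)])  # z is binary
    bounds = Bounds(np.concatenate([np.full(p, -np.inf), np.zeros(p)]),
                    np.concatenate([np.full(p, np.inf), np.ones(p)]))
    res = milp(c, constraints=[corr, link_up, link_lo],
               integrality=integrality, bounds=bounds)
    return res.x[:p], res
```

On a small noiseless example with a 2-sparse true coefficient vector, the MILP recovers a 2-sparse solution satisfying the correlation budget. The off-the-shelf HiGHS solver used here illustrates the formulation only; the scalability claims in the abstract refer to specialized implementations.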