Title: Isotonic regression meets LASSO
Authors: Matey Neykov - Carnegie Mellon University (United States) [presenting]
Abstract: A two-step procedure is considered for monotone increasing and smooth additive single index models with Gaussian designs. The proposed procedure is simple, easy to implement with existing software, and consists of consecutively applying LASSO and isotonic regression. Aside from formalizing this procedure, we provide theoretical guarantees regarding its performance: 1) we show that our procedure controls the in-sample squared error; 2) we demonstrate that the procedure can be used to predict new observations, by showing that the absolute prediction error can be controlled with high probability. Our bounds exhibit a trade-off between two rates: the minimax rate for estimating the high-dimensional quadratic loss, and the minimax nonparametric rate for estimating a monotone increasing function. Time permitting, we may also consider applying the same procedure to binary single index models with Gaussian design.
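The two-step procedure described above can be sketched with standard software, as the abstract suggests. The following is a minimal illustration, not the paper's exact formulation: the data-generating link, sparsity level, and LASSO penalty are all hypothetical choices made for the example.

```python
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.isotonic import IsotonicRegression

rng = np.random.default_rng(0)
n, p, s = 200, 500, 5
X = rng.standard_normal((n, p))            # Gaussian design
beta = np.zeros(p)
beta[:s] = 1.0 / np.sqrt(s)                # sparse, unit-norm index (assumption)
link = lambda t: t + np.tanh(t)            # monotone increasing, smooth link (illustrative)
y = link(X @ beta) + 0.1 * rng.standard_normal(n)

# Step 1: LASSO to estimate the index direction (alpha chosen for illustration)
lasso = Lasso(alpha=0.05).fit(X, y)
index = X @ lasso.coef_

# Step 2: isotonic regression of y on the estimated one-dimensional index
iso = IsotonicRegression(out_of_bounds="clip").fit(index, y)
y_hat = iso.predict(index)

in_sample_mse = np.mean((y_hat - y) ** 2)
```

Since isotonic regression minimizes squared error over monotone fits (and a constant is monotone), the in-sample error of the second step never exceeds the sample variance of `y`.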