Title: Optimization in high dimensional additive models
Authors: Noah Simon - UW Biostatistics (United States) [presenting]
Abstract: A general framework for sparse additive regression is discussed. We allow the class of each additive component to be quite general (characterized by semi-norm smoothness; this includes monotonicity of derivatives and variation/Sobolev smoothness). We show that by minimizing a simple convex problem, we can estimate these functions at the minimax rate (over functions in that smooth additive class). In addition, we show that the penalized regression problem can be efficiently solved using a proximal gradient descent algorithm: each prox step decouples into $p$ univariate penalized regression problems, and each of these univariate penalized problems can in turn be written as a simple update of the solution to a univariate non-parametric regression problem (for which we often have efficient algorithms). Finally, we characterize the statistical performance of the output of our algorithm after a finite number of steps.
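The decoupling described in the abstract can be sketched in code. The following is an illustrative simplification, not the paper's actual method: each component $f_j$ is stored as its vector of fitted values, the smoothness class is taken to be monotone (isotonic) functions, and sparsity is imposed through a group soft-thresholding update. The prox step then splits into $p$ independent univariate problems, each solved by a standard pool-adjacent-violators isotonic fit followed by a simple shrinkage of the result. All function names and the choice of step size here are assumptions for the sketch.

```python
import numpy as np

def pava(y):
    """Pool-adjacent-violators: nondecreasing least-squares fit to y."""
    blocks = []  # each block: [block mean, block count]
    for v in y:
        blocks.append([float(v), 1])
        # merge adjacent blocks while monotonicity is violated
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            v2, c2 = blocks.pop()
            v1, c1 = blocks[-1]
            blocks[-1] = [(v1 * c1 + v2 * c2) / (c1 + c2), c1 + c2]
    return np.concatenate([np.full(c, v) for v, c in blocks])

def sparse_additive_prox_grad(X, y, lam, n_iter=100):
    """Illustrative proximal gradient sketch for a sparse additive model
    with monotone components (an assumed simplification of the general
    framework in the abstract).

    The prox step decouples across the p components: for each j it
    (i) solves a univariate nonparametric problem (isotonic regression
    of the gradient-step values on X[:, j]) and (ii) applies a simple
    group soft-thresholding update to induce sparsity.
    """
    n, p = X.shape
    F = np.zeros((n, p))              # column j holds fitted values of f_j
    order = np.argsort(X, axis=0)
    step = 1.0 / p                    # the squared-error gradient is p-Lipschitz here
    for _ in range(n_iter):
        r = F.sum(axis=1) - y         # shared residual of the additive fit
        Z = F - step * r[:, None]     # gradient step (same residual for all j)
        for j in range(p):            # prox decouples over the p components
            idx = order[:, j]
            fit = np.empty(n)
            fit[idx] = pava(Z[idx, j])     # univariate isotonic fit
            fit -= fit.mean()              # center for identifiability
            nrm = np.linalg.norm(fit)
            shrink = max(0.0, 1.0 - step * lam / nrm) if nrm > 0 else 0.0
            F[:, j] = shrink * fit         # simple shrinkage update for sparsity
    return F
```

Because each inner problem touches only one coordinate of the design, the per-iteration cost is that of $p$ univariate nonparametric fits, which is what makes the overall convex problem tractable in high dimensions.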