Title: Principal component reduction of a nonparametric additive model with variable selection
Authors: Shiyuan He - Renmin University of China (China) [presenting]
Kejun He - Renmin University of China (China)
Abstract: Additive models have been widely used as a flexible nonparametric regression method that can overcome the curse of dimensionality. By using a sparsity-inducing penalty for variable selection, several authors have developed methods for fitting additive models when the number of predictors is very large, sometimes even larger than the sample size. However, despite good asymptotic properties, the finite-sample performance of existing methods deteriorates considerably when the number of relevant predictors becomes moderately large. We propose to reduce the number of additive component functions to be estimated using principal components. To fit the reduced additive model to the data, we develop a novel algorithm that solves the penalized least squares problem on a fixed-rank manifold with a sparsity-inducing penalty. Our asymptotic theory shows that the resulting estimator has a faster convergence rate than the estimator without principal component reduction, and this holds even when the reduced model is only an approximation, provided that the approximation error is small. Moreover, the proposed method consistently identifies the relevant predictors. The advantage of the reduced additive model is also illustrated in a simulation study.
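To make the idea concrete, the sketch below fits a rank-constrained additive model: each predictor gets a basis expansion, the p-by-K coefficient matrix is factored as B = U V^T with a small fixed rank, and the row-sparse factor U is updated by proximal-gradient steps with group-wise soft-thresholding while V is refit by least squares. This is only an illustrative alternating scheme, not the authors' fixed-rank-manifold algorithm; the cubic polynomial basis (a stand-in for splines), the rank, and the penalty level are all assumptions made for the example.

```python
import numpy as np

def fit_reduced_additive(y, Phi, rank=2, lam=1.0, n_iter=50, seed=0):
    """Fit y ~ sum_j Phi[j] @ B[j] with B = U @ V.T of fixed rank.

    Phi : list of p arrays, each (n, K), the basis expansion per predictor.
    Alternates (a) proximal-gradient steps on U with row-wise
    soft-thresholding (a group-lasso penalty, so whole predictors can be
    dropped) and (b) an exact least-squares update of the shared factor V.
    """
    rng = np.random.default_rng(seed)
    p, (n, K) = len(Phi), Phi[0].shape
    U = rng.normal(scale=0.1, size=(p, rank))
    V = rng.normal(scale=0.1, size=(K, rank))
    for _ in range(n_iter):
        # --- U-step: group lasso with V fixed ---
        W = np.hstack([Phi[j] @ V for j in range(p)])   # n x (p*rank)
        L = np.linalg.norm(W, 2) ** 2                   # Lipschitz constant
        for _ in range(5):
            u = U.reshape(-1)
            u = u - W.T @ (W @ u - y) / L               # gradient step
            U = u.reshape(p, rank)
            # proximal step: shrink each row of U toward zero as a group
            norms = np.linalg.norm(U, axis=1, keepdims=True)
            U *= np.maximum(1.0 - (lam / L) / np.maximum(norms, 1e-12), 0.0)
        # --- V-step: plain least squares with U fixed ---
        A = sum(Phi[j][:, :, None] * U[j][None, None, :] for j in range(p))
        V = np.linalg.lstsq(A.reshape(n, K * rank), y,
                            rcond=None)[0].reshape(K, rank)
    return U, V

# Toy data: 8 predictors, only the first 3 enter the regression function.
rng = np.random.default_rng(1)
n, p = 300, 8
X = rng.uniform(-1, 1, size=(n, p))
y = (np.sin(np.pi * X[:, 0]) + X[:, 1] ** 2 + 0.5 * X[:, 2] ** 3
     + 0.1 * rng.normal(size=n))
y = y - y.mean()
# Centered cubic polynomial basis per predictor (illustrative choice).
Phi = [np.column_stack([X[:, j] ** d for d in (1, 2, 3)]) for j in range(p)]
Phi = [B - B.mean(axis=0) for B in Phi]

U, V = fit_reduced_additive(y, Phi, rank=2, lam=1.0)
selected = np.where(np.linalg.norm(U, axis=1) > 1e-8)[0]  # surviving predictors
```

Predictors whose rows of U are shrunk exactly to zero are screened out, while all estimated component functions share the K-by-rank factor V, which plays the role of the principal components: only rank "directions" of component functions are estimated instead of p separate ones.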