Title: Scalable kernel-based variable selection
Authors: Junhui Wang - City University of Hong Kong (Hong Kong)
Shaogao Lyu - Southwestern University of Finance and Economics (China)
Xin He - Shanghai University of Finance and Economics (China) [presenting]
Abstract: Variable selection is central to sparse modeling, and many methods have been proposed under various model assumptions. We will present a scalable framework for model-free variable selection in a reproducing kernel Hilbert space (RKHS), without specifying any restrictive model. Unlike most existing model-free variable selection methods, which require the dimension to be fixed, the proposed method allows the dimension $p$ to diverge with the sample size $n$. The proposed method is motivated by the classical hard-thresholding variable selection for linear models, but allows for general variable effects. It does not require specification of the underlying model for the response, which is appealing in sparse modeling with a large number of variables. The proposed method can also be adapted to various scenarios with specific model assumptions, including linear models, quadratic models, and additive models. The asymptotic estimation and variable selection consistencies of the proposed method are established in all these scenarios. If time permits, the extension of the proposed method beyond mean regression will also be discussed.
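To illustrate the flavor of model-free, hard-thresholding-style variable selection in an RKHS, the following is a minimal sketch (not the authors' exact estimator): fit a kernel ridge regression with a Gaussian kernel, score each variable by the empirical norm of the fitted function's partial derivative in that coordinate, and hard-threshold the scores. The bandwidth `sigma`, ridge parameter `lam`, the threshold rule, and the synthetic data are all illustrative choices.

```python
import numpy as np

def gaussian_kernel(X, Z, sigma):
    # Pairwise Gaussian kernel matrix: k(x, z) = exp(-||x - z||^2 / (2 sigma^2)).
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def gradient_scores(X, y, sigma=1.0, lam=1e-2):
    # Fit kernel ridge regression f(x) = sum_j alpha_j k(x, x_j), then score
    # variable l by the empirical L2 norm of df/dx^(l) over the sample points.
    n, p = X.shape
    K = gaussian_kernel(X, X, sigma)
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y)
    scores = np.empty(p)
    for l in range(p):
        # d/dx_i^(l) of k(x_i, x_j) = (x_j^(l) - x_i^(l)) / sigma^2 * k(x_i, x_j)
        diff = (X[None, :, l] - X[:, None, l]) / sigma ** 2
        grad_l = (K * diff) @ alpha          # df/dx^(l) at each sample point
        scores[l] = np.sqrt(np.mean(grad_l ** 2))
    return scores

# Synthetic example: y depends only on the first two of p = 10 variables,
# through nonlinear (sine and quadratic) effects -- no model is specified
# to the selection procedure itself.
rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.uniform(-1, 1, size=(n, p))
y = np.sin(np.pi * X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.standard_normal(n)

scores = gradient_scores(X, y)
selected = np.where(scores > 0.25 * scores.max())[0]  # illustrative hard threshold
print("gradient scores:", np.round(scores, 3))
print("selected variables:", sorted(selected))
```

In this sketch the gradient scores for the two informative variables dominate those of the noise variables, so thresholding recovers the active set; the RKHS framing is what lets the same score capture linear, quadratic, and more general variable effects without a model specification.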