Title: Functional variable selection based on RKHS
Authors: Jose Luis Torrecilla - Universidad Autónoma de Madrid (Spain) [presenting]
Jose Berrendero - Universidad Autónoma de Madrid (Spain)
Antonio Cuevas - Universidad Autónoma de Madrid (Spain)
Abstract: Variable selection techniques have become a popular dimension-reduction tool with an easy interpretation. However, no standard method has yet emerged in the functional classification framework. We propose a new functionally motivated variable selection methodology (RK-VS). This method arises naturally from viewing the functional classification problem from a reproducing kernel Hilbert space (RKHS) perspective. In this context, under a general Gaussian model and a sparsity assumption, the optimal rules turn out to depend on a finite number of variables. These variables can be selected by maximizing the Mahalanobis distance between the finite-dimensional projections of the class means; RK-VS is an iterative approximation to this criterion. The method is easy to interpret, fast, and allows extra information about the model to be incorporated easily. The empirical performance of RK-VS is extremely good when the problems considered fit the assumed model, and it is also quite robust against partial departures from the hypotheses, typically yielding very good results in general problems.
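The selection criterion described above can be illustrated with a minimal sketch. The code below is an assumption-laden illustration, not the authors' implementation: it treats the curves as discretized on a common grid, estimates a pooled covariance, and greedily adds one discretization point at a time so as to maximize the Mahalanobis distance between the projected class means. The function name `rkvs_greedy` and the greedy strategy are hypothetical choices for this sketch.

```python
import numpy as np

def rkvs_greedy(X0, X1, n_vars):
    """Sketch of Mahalanobis-based variable selection (assumptions noted above).
    X0, X1: (n_samples, n_points) arrays of discretized curves, one per class.
    Returns indices of the selected discretization points."""
    diff = X0.mean(axis=0) - X1.mean(axis=0)  # difference of class means
    # Pooled covariance estimate (homoscedastic Gaussian model assumed)
    cov = (np.cov(X0, rowvar=False) * (len(X0) - 1)
           + np.cov(X1, rowvar=False) * (len(X1) - 1)) / (len(X0) + len(X1) - 2)
    selected = []
    candidates = list(range(X0.shape[1]))
    for _ in range(n_vars):
        best, best_dist = None, -np.inf
        for j in candidates:
            idx = selected + [j]
            # Squared Mahalanobis distance between projected class means
            d = diff[idx] @ np.linalg.solve(cov[np.ix_(idx, idx)], diff[idx])
            if d > best_dist:
                best, best_dist = j, d
        selected.append(best)
        candidates.remove(best)
    return selected
```

On synthetic data where a single grid point carries the mean difference between classes, this sketch picks that point first; subsequent points are added only insofar as they increase the projected Mahalanobis distance.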