Title: Sparse selection and prediction with high-dimensional categorical data
Authors: Wojciech Rejchel - University of Warsaw (Poland) [presenting]
Piotr Pokarowski - University of Warsaw (Poland)
Szymon Nowakowski - University of Warsaw (Poland)
Abstract: Sparse prediction with categorical data is challenging even for a moderate number of variables, because roughly one parameter is needed to encode each category or level. The group lasso is a well-known and efficient algorithm for selecting continuous or categorical variables, but the estimates corresponding to a selected factor usually all differ, so the fitted model may not be sparse. To make a group lasso solution sparse, we propose merging levels of a selected factor whenever the difference between their corresponding estimates is below a predetermined threshold. We prove that under weak conditions our algorithm recovers the true sparse linear or logistic model even in the high-dimensional scenario, that is, when the number of parameters exceeds the learning sample size. To our knowledge, although selection consistency has been proven many times for different algorithms fitting sparse models with categorical variables, our result is the first for the high-dimensional scenario. Numerical experiments show the satisfactory performance of the method.
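The level-merging step described above can be sketched in code. This is only an illustrative sketch, not the authors' exact procedure: it assumes the group lasso estimates for one selected factor are already available, and it fuses levels by a simple single-linkage pass over the sorted coefficients (the name `merge_levels` and the pooling-by-average rule are our own choices for illustration).

```python
import numpy as np

def merge_levels(coefs, threshold):
    """Merge levels of a factor whose estimated coefficients differ by
    less than `threshold`. Levels are sorted by coefficient value and
    adjacent levels are fused when the gap is below the threshold;
    each merged group shares one pooled (averaged) coefficient."""
    order = np.argsort(coefs)
    groups = [[order[0]]]
    for i in order[1:]:
        # fuse with the previous group if the gap to its largest
        # coefficient is below the threshold, else start a new group
        if coefs[i] - coefs[groups[-1][-1]] < threshold:
            groups[-1].append(i)
        else:
            groups.append([i])
    # one pooled coefficient per merged group -> fewer distinct values,
    # hence a sparser fitted model
    merged = np.empty_like(coefs, dtype=float)
    for g in groups:
        merged[list(g)] = coefs[list(g)].mean()
    return groups, merged

# Hypothetical group lasso estimates for the 5 levels of one factor
coefs = np.array([0.02, 1.01, 0.00, 0.98, 2.50])
groups, merged = merge_levels(coefs, threshold=0.1)
# levels 0 and 2 merge, levels 1 and 3 merge, level 4 stays alone,
# leaving 3 distinct coefficient values instead of 5
```

In practice the threshold would be tuned (e.g. by an information criterion or cross-validation), but the sketch shows why merging near-equal estimates yields a genuinely sparse model from a group lasso fit.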