Title: Estimation and prediction via group SLOPE (gSLOPE)
Authors: Damian Brzyski - Indiana University Bloomington (United States) [presenting]
Abstract: The penalized method group SLOPE (gSLOPE) is presented, which can be used to select entire groups of explanatory variables in the classical multiple linear regression model. Such groups could, for example, be defined by the different levels of an explanatory factor. Our method can be treated as a generalization of the widely known group LASSO. We focus on theoretical results such as gSLOPE's property of controlling the group false discovery rate (gFDR) in the orthogonal case. This property says that in the idealized case, when all columns of the design matrix are orthogonal, we can select the tuning parameters such that control of the expected proportion of falsely discovered groups among all discovered groups (which we define as the gFDR) is guaranteed at any predefined level $q\in(0,1)$. The extension to the near-orthogonal situation and an algorithm for parameter selection will also be discussed. Moreover, we present the result that our method adapts to unknown sparsity and is asymptotically minimax, which means, in some sense, that gSLOPE yields the best possible prediction.
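The gFDR criterion described above can be stated compactly. As a sketch in notation not fixed by the abstract itself, let $V$ denote the number of falsely discovered groups and $R$ the total number of discovered groups; then the quantity controlled is

```latex
% gFDR: expected proportion of false group discoveries,
% with the convention max(R, 1) to handle the case R = 0.
\mathrm{gFDR} \;=\; \mathbb{E}\!\left[\frac{V}{\max(R,\,1)}\right],
```

and the control property asserts that the tuning parameters can be chosen so that $\mathrm{gFDR} \le q$ for any predefined level $q\in(0,1)$ when the design matrix has orthogonal columns.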