Title: Regularization parameter selection for sparse methods via AIC
Authors: Yoshiyuki Ninomiya - Kyushu University (Japan) [presenting]
Abstract: Sparse methods generally contain a regularization parameter that determines the result, and several information criteria have been proposed for selecting it. While any of them would assure consistency in model selection, there is no appropriate rule for choosing among the different possible criteria. On the other hand, a finite correction to the AIC has been provided for the LASSO in a Gaussian linear regression setting. The finite correction is theoretically justified not from the viewpoint of consistency but from that of minimizing the prediction error, and it does not suffer from the above-mentioned difficulty of choice. In general, however, the finite correction cannot be obtained for generalized linear models, so we derive a criterion from the original definition of the AIC, that is, an asymptotically unbiased estimator of the Kullback-Leibler divergence. Our criterion is easily computed and requires less computation than cross-validation, yet its performance is almost the same as, or superior to, that of cross-validation.
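To make the setting concrete, the following is a minimal sketch (not the authors' corrected criterion) of AIC-based regularization-parameter selection for the LASSO in Gaussian linear regression: fit the LASSO over a grid of penalty values and pick the one minimizing AIC = n log(RSS/n) + 2 df, using the standard result that the number of nonzero coefficients is an unbiased estimate of the LASSO degrees of freedom. All function names and the simulated data are illustrative.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """LASSO by cyclic coordinate descent with soft-thresholding.
    Minimizes (1/(2n))||y - X b||^2 + lam * ||b||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n  # per-coordinate curvature
    for _ in range(n_iter):
        for j in range(p):
            # partial residual excluding coordinate j
            r = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ r / n
            # soft-threshold update
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return beta

def select_lambda_aic(X, y, lams):
    """Choose lam minimizing AIC = n*log(RSS/n) + 2*df,
    with df = number of nonzero LASSO coefficients."""
    n = len(y)
    best = None
    for lam in lams:
        beta = lasso_cd(X, y, lam)
        rss = ((y - X @ beta) ** 2).sum()
        df = int((beta != 0).sum())
        aic = n * np.log(rss / n) + 2 * df
        if best is None or aic < best[0]:
            best = (aic, lam, beta)
    return best  # (aic value, chosen lam, fitted coefficients)

# Illustrative simulated data (not from the paper)
rng = np.random.default_rng(0)
n, p = 100, 10
X = rng.standard_normal((n, p))
beta_true = np.array([2.0, -1.5, 0, 0, 1.0, 0, 0, 0, 0, 0])
y = X @ beta_true + 0.5 * rng.standard_normal(n)
aic, lam, beta = select_lambda_aic(X, y, np.logspace(-3, 0, 20))
```

Unlike K-fold cross-validation, which refits the LASSO path K times, this requires only a single path of fits, which is the computational advantage the abstract refers to; the paper's contribution is a criterion of this type that remains valid beyond the Gaussian case.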