CMStatistics 2022
Title: Going beyond spike-and-slab: Sparse modeling with the L1-ball priors
Authors: Leo Duan - University of Florida (United States) [presenting]
Abstract: The l1-regularization is very popular in high-dimensional statistics: it turns the combinatorial problem of choosing which subset of the parameters is zero into a simple continuous optimization. Using a continuous prior concentrated near zero, the Bayesian counterparts are successful in quantifying the uncertainty in variable selection problems; nevertheless, the lack of exact zeros makes them difficult to use for broader problems such as change-point detection and rank selection. Inspired by the duality between l1-regularization and a constraint onto an l1-ball, we propose a new prior obtained by projecting a continuous distribution onto the l1-ball. This creates positive probability on the ball boundary, which contains both continuous elements and exact zeros. Unlike the spike-and-slab prior, this l1-ball projection is continuous and almost surely differentiable, making posterior estimation amenable to the Hamiltonian Monte Carlo algorithm. We examine its properties, such as the volume change due to the projection, the connection to the combinatorial prior, and the minimax concentration rate in the linear problem. We demonstrate the usefulness of exact zeros in simplifying combinatorial problems, such as change-point detection in time series, dimension selection in mixture models, and low-rank-plus-sparse change detection in medical images.
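The projection step at the heart of the prior can be sketched with the standard sort-based Euclidean projection onto the l1-ball, which reduces to soft-thresholding. This is a minimal illustration, not the paper's implementation: the function name and the fixed radius `r` are assumptions for the sketch (in a full model the radius would itself carry a prior), but it shows how pushing a continuous draw through the projection yields exact zeros with positive probability.

```python
import numpy as np

def project_l1_ball(beta, r):
    """Euclidean projection of beta onto the l1-ball of radius r
    (sort-based algorithm; equivalent to soft-thresholding at a data-driven level)."""
    if np.sum(np.abs(beta)) <= r:
        return beta.copy()  # already inside the ball: projection is the identity
    u = np.sort(np.abs(beta))[::-1]          # magnitudes in descending order
    css = np.cumsum(u)
    idx = np.arange(1, beta.size + 1)
    k = np.nonzero(u - (css - r) / idx > 0)[0][-1]  # largest feasible support size - 1
    theta = (css[k] - r) / (k + 1)                  # soft-threshold level
    return np.sign(beta) * np.maximum(np.abs(beta) - theta, 0.0)

# Illustrative draw from the induced prior: a Gaussian vector pushed
# through the projection lands on the ball boundary with exact zeros.
rng = np.random.default_rng(0)
beta = project_l1_ball(rng.normal(size=20), r=2.0)
print("number of exact zeros:", int(np.sum(beta == 0.0)))
```

Because the map is piecewise linear in the underlying Gaussian draw, it is almost surely differentiable, which is what makes gradient-based samplers such as HMC applicable despite the point mass at zero.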