B0815
Title: Optimal compressed sparse regression
Authors: Alexander Munteanu - TU Dortmund (Germany) [presenting]
Abstract: Low-distortion random projections are at the heart of fast data compression algorithms for statistical problems such as linear regression. However, their target dimension is necessarily at least linear in the data dimension $d$. A natural assumption for high-dimensional regression problems is that useful solutions lie in the span of only a small number, at most $k$, of columns. We show how this sparsity assumption helps to compress data for various regression models to within essentially tight $\Theta( k \log(d) / \epsilon^2 )$ bounds, while preserving the regression loss up to a $(1+\epsilon)$ factor. Similar results were known for the related sparse recovery problem studied in compressed sensing, which, surprisingly, turns out to be a strictly easier problem that admits smaller compression sizes. Finally, we show similar compression bounds for LASSO regression, a popular convex relaxation and widely used heuristic for sparse regression.
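To illustrate the sketch-and-solve idea the abstract builds on, here is a minimal numerical sketch (an illustration only, not the authors' construction, and using a plain Gaussian projection with an ad hoc sketch size `m` rather than the sparsity-aware $\Theta(k \log(d)/\epsilon^2)$ bound): compress the rows of a regression problem with a random projection and check that the least-squares loss is preserved up to a small factor.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 5000, 10  # many rows, few columns; sizes chosen for illustration
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)

# Exact least-squares solution and its loss ||Ax - b||_2.
x_opt, *_ = np.linalg.lstsq(A, b, rcond=None)
loss_opt = np.linalg.norm(A @ x_opt - b)

# Gaussian sketch S with m << n rows, scaled so that E[S^T S] = I.
m = 400  # ad hoc choice; the paper's bounds dictate the right size
S = rng.standard_normal((m, n)) / np.sqrt(m)

# Solve the compressed problem min_x ||S A x - S b||_2 ...
x_sk, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)
# ... and evaluate its solution on the ORIGINAL (uncompressed) loss.
loss_sk = np.linalg.norm(A @ x_sk - b)

print(loss_sk / loss_opt)  # should be close to 1, a (1+eps)-approximation
```

The point of the experiment is that the solution of the $m \times d$ compressed problem nearly minimizes the original $n \times d$ loss, which is exactly the $(1+\epsilon)$-preservation guarantee the abstract refers to.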