B1267
Title: Adaptive sketching methods
Authors: Mert Pilanci - Stanford University (United States) [presenting]
Abstract: Highly efficient randomized solvers are considered for the least-squares problem and a general class of nonlinear optimization problems. We show that the projection dimension can be reduced to the effective dimension of the problem while preserving high-probability convergence guarantees. To this end, we derive sharp matrix deviation inequalities over ellipsoids for both Gaussian and SRHT embeddings. Specifically, we improve the constant of a classical Gaussian concentration bound, while for SRHT embeddings our deviation inequality requires a novel technical approach. Leveraging these bounds, we design a practical, adaptive algorithm that does not require knowing the effective dimension beforehand. Our method starts with an embedding dimension of 1 and increases it over iterations, never exceeding the effective dimension. Consequently, our algorithm improves the state-of-the-art computational complexity for solving regularized least-squares problems and Newton systems in nonlinear optimization. Further, we show numerically that it outperforms standard iterative solvers such as the conjugate gradient method and its preconditioned variant on several standard machine learning datasets.
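The adaptive scheme outlined in the abstract can be illustrated with a toy implementation. This is a sketch of the general idea, not the authors' algorithm: it runs an iterative-Hessian-sketch-style Newton iteration for regularized least squares with a Gaussian embedding whose dimension starts at 1 and doubles whenever the sketched step fails to halve the gradient norm. The function name `adaptive_ihs`, the doubling rule, and the acceptance test are assumptions made for illustration; with heavier regularization the effective dimension shrinks, so the loop can stop growing the sketch early.

```python
import numpy as np

def adaptive_ihs(A, b, lam=1.0, tol=1e-6, max_iter=100, seed=0):
    """Toy adaptive sketched Newton solver for min ||Ax-b||^2/2 + lam*||x||^2/2.

    Illustrative only: a Gaussian embedding of dimension m (starting at m=1)
    replaces the exact Hessian; m doubles whenever the sketched Newton step
    fails to halve the gradient norm.
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    m = 1  # initial embedding dimension, as in the adaptive scheme
    for _ in range(max_iter):
        g = A.T @ (A @ x - b) + lam * x                # exact gradient
        if np.linalg.norm(g) < tol:
            break
        S = rng.standard_normal((m, n)) / np.sqrt(m)   # Gaussian embedding
        SA = S @ A
        H = SA.T @ SA + lam * np.eye(d)                # sketched Hessian
        x_new = x - np.linalg.solve(H, g)              # sketched Newton step
        g_new = A.T @ (A @ x_new - b) + lam * x_new
        if np.linalg.norm(g_new) <= 0.5 * np.linalg.norm(g):
            x = x_new                                  # sufficient decrease: accept
        else:
            m = min(2 * m, n)                          # otherwise grow the sketch
    return x, m
```

The returned `m` shows how large the embedding had to grow; in well-regularized problems it typically stays far below the ambient dimension `n`, which is the computational saving the abstract refers to.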