CMStatistics 2020
Title: How to reduce dimension with PCA and random projections
Authors: Edgar Dobriban - University of Pennsylvania (United States) [presenting]
Fan Yang - University of Pennsylvania (United States)
Sifan Liu - Stanford University (United States)
David Woodruff - Carnegie Mellon University (United States)
Abstract: The aim is to study how to combine "data-oblivious" methods, such as random projections and sketching, with "data-aware" methods, such as principal component analysis (PCA), to get the best of both. We study "sketch and solve" methods that first take a random projection (or sketch) and then compute PCA. We characterize the performance of several popular sketching methods (iid random projections, random sampling, the subsampled Hadamard transform, count sketch, etc.) in a general "signal-plus-noise" (or spiked) data model. Compared to prior work, our results (1) are asymptotically exact, and (2) apply when the signal components are only slightly above the noise level, while the projection dimension is non-negligible. We also study stronger signals, allowing more general covariance structures. We find that (a) the signal strength decreases under projection in a delicate way that depends on the structure of the data and the sketching method, (b) orthogonal projections are more accurate, (c) randomization does not hurt much, due to concentration of measure, and (d) count sketch can be improved by a normalization method. The results have implications for statistical learning and data analysis. We also show that the results are highly accurate in simulations and on empirical data.
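As a rough illustration of the "sketch and solve" idea described above, the following sketch (not the authors' code; all dimensions and the signal strength are assumed values chosen for illustration) generates data from a rank-one spiked model, reduces the number of samples with an iid Gaussian random projection, and then runs PCA on the sketched data to recover the spike direction:

```python
import numpy as np

rng = np.random.default_rng(0)

# Spiked ("signal-plus-noise") data model: X = s * sqrt(n) * u v^T + noise,
# with u, v unit vectors. Dimensions and signal strength s are illustrative.
n, d, m = 2000, 100, 400          # samples, features, sketch size
s = 3.0                           # signal strength
u = rng.standard_normal(n); u /= np.linalg.norm(u)
v = rng.standard_normal(d); v /= np.linalg.norm(v)
X = s * np.sqrt(n) * np.outer(u, v) + rng.standard_normal((n, d))

# "Sketch and solve": first project the n rows down to m rows with an
# iid Gaussian sketching matrix, then compute PCA on the smaller matrix.
S = rng.standard_normal((m, n)) / np.sqrt(m)   # iid random projection
Y = S @ X                                       # sketched data, m x d

# PCA on the sketch: the top right singular vector estimates the spike v.
_, _, Vt = np.linalg.svd(Y, full_matrices=False)
v_hat = Vt[0]

# Recovery quality: |cos(angle)| between estimate and truth (1 = perfect).
cos = abs(v_hat @ v)
```

Consistent with the abstract's findings, the recovered direction is close to the true spike, but the effective signal strength is reduced by the projection: the sketched noise grows relative to the spike as the sketch size m shrinks, so the cosine degrades in a way that depends on the sketching method.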