Title: Memory-efficient low-rank approximation from incomplete entries via nonconvex optimization
Authors: Xiaodong Li - UC Davis (United States) [presenting]
Ji Chen - UC Davis (United States)
Abstract: Low-rank approximation is one of the most useful tools in statistics and machine learning. It underlies dimension-reduction methods such as PCA, ICA, CCA, and kernel PCA, and it also enables memory-efficient computation, such as kernel approximation. Because of the large sample sizes in machine learning applications, standard low-rank approximation algorithms, such as the singular value decomposition, usually cannot be applied directly. Taking kernel PCA as an example, it is intensive in both memory and computation to form and store the entire Gram matrix. Inspired by recent developments in nonconvex matrix completion, we propose a general framework for rank-$r$ approximation from a few entries, without any spectral or incoherence assumptions on the underlying matrix. Theoretically, we present a general result whose corollaries improve or match state-of-the-art results for various matrix completion setups in the literature. Moreover, we apply our method to randomized kernel PCA to showcase its effectiveness and usefulness.
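To make the setting concrete, here is a minimal illustrative sketch (not the authors' algorithm, whose details are not given in the abstract) of nonconvex rank-$r$ approximation from a random subset of entries: spectral initialization from the rescaled observed matrix, followed by gradient descent on a factorization $XY^\top$ that fits only the observed entries. The matrix sizes, sampling rate, and step size are arbitrary choices for the demo.

```python
import numpy as np

# Illustrative sketch: recover a rank-r matrix M from a subset of its
# entries by gradient descent on the factorization X @ Y.T (a standard
# nonconvex matrix-completion recipe; all parameters here are assumptions).
rng = np.random.default_rng(0)
n, r = 60, 3
A = rng.standard_normal((n, r))
M = A @ A.T                       # ground-truth rank-r matrix (symmetric, like a Gram matrix)

mask = rng.random((n, n)) < 0.4   # observe roughly 40% of the entries
p = mask.mean()

# Spectral initialization: top-r SVD of the rescaled observed matrix.
U, s, Vt = np.linalg.svd(mask * M / p)
X = U[:, :r] * np.sqrt(s[:r])
Y = Vt[:r].T * np.sqrt(s[:r])

# Gradient descent on the squared error over observed entries only.
step, iters = 0.005, 500
for _ in range(iters):
    R = mask * (X @ Y.T - M)      # residual restricted to observed entries
    X, Y = X - step * (R @ Y), Y - step * (R.T @ X)

rel_err = np.linalg.norm(X @ Y.T - M) / np.linalg.norm(M)
```

Only the sampled entries of `M` are ever touched inside the loop, which is what makes this style of method attractive when forming the full Gram matrix is too expensive.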