A1045
Title: Stochastic regularized block majorization-minimization with weakly convex and multi-convex surrogates
Authors: Hanbaek Lyu - University of Wisconsin - Madison (United States) [presenting]
Abstract: Stochastic majorization-minimization (SMM) is a class of stochastic optimization algorithms that sample i.i.d. data points from a fixed data distribution and minimize recursively defined averaged surrogates of an objective function. In the existing framework, the surrogates are required to be strongly convex, and a convergence rate analysis has been missing, in contrast to the extensive literature on stochastic gradient descent (SGD) methods. In this talk, we extend SMM to settings where the surrogates are allowed to be weakly convex or block multi-convex, and the averaged surrogates are approximately minimized with proximal regularization or block-minimized within diminishing radii, respectively. Our general framework widens the applicability of SMM, including to online CANDECOMP/PARAFAC (CP) dictionary learning, and yields greater computational efficiency, especially when the problem dimension is large. Our analysis shows that SMM can be faster than SGD in optimizing the empirical loss and can match the optimal rate of SGD for the expected loss. As a corollary, our results provide the first convergence rate bounds for various online matrix and tensor decomposition algorithms under a general Markovian data setting.
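To make the recursion concrete, the following is a minimal sketch (not the authors' code) of one form of SMM with proximally regularized minimization of the averaged surrogates, specialized to streaming least-squares regression with quadratic prox-linear surrogates; the curvature bound L, the proximal parameter rho, and the weight schedule w_n = 1/n are illustrative assumptions, not choices taken from the talk.

import numpy as np

rng = np.random.default_rng(0)

d = 10                          # problem dimension
theta_true = rng.normal(size=d) # ground-truth parameter for the synthetic stream
theta = np.zeros(d)             # current iterate theta_{n-1}

L = 25.0                        # assumed curvature bound so each surrogate majorizes the per-sample loss
rho = 1.0                       # assumed proximal regularization strength
q = np.zeros(d)                 # linear coefficient of the averaged surrogate

for n in range(1, 5001):
    # Sample an i.i.d. data point (a_n, y_n) from a fixed distribution.
    a = rng.normal(size=d)
    y = a @ theta_true + 0.1 * rng.normal()

    # Quadratic prox-linear surrogate of the per-sample loss 0.5*(a@theta - y)^2
    # around the current iterate:
    #   g_n(theta) = (L/2)||theta||^2 + (grad_n - L*theta_{n-1}) @ theta + const.
    grad = a * (a @ theta - y)

    # Recursively averaged surrogate: gbar_n = (1 - w_n) * gbar_{n-1} + w_n * g_n.
    # Only the linear coefficient q needs to be tracked, since every surrogate
    # shares the same quadratic term (L/2)||theta||^2.
    w = 1.0 / n                 # diminishing weight schedule (an assumption)
    q = (1 - w) * q + w * (grad - L * theta)

    # Proximally regularized minimization of the averaged surrogate:
    #   theta_n = argmin_theta  gbar_n(theta) + (rho/2)||theta - theta_{n-1}||^2,
    # which has the closed form below for this quadratic surrogate.
    theta = (rho * theta - q) / (L + rho)

print("estimation error:", np.linalg.norm(theta - theta_true))

In this toy instance each surrogate is strongly convex, so the sketch only illustrates the surrogate-averaging and proximal-regularization mechanics; the weakly convex and block multi-convex cases analyzed in the talk replace the closed-form update with an approximate minimization or a block minimization within diminishing radii.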