Title: Stochastic optimization for AUC maximization in machine learning
Authors: Yiming Ying - State University of New York at Albany (United States) [presenting]
Abstract: Stochastic optimization algorithms such as stochastic gradient descent (SGD) update the model sequentially with cheap per-iteration costs, making them amenable to large-scale streaming data analysis. However, most existing studies focus on classification accuracy and cannot be directly applied to the important problems of maximizing the area under the ROC curve (AUC) in imbalanced classification and bipartite ranking. We will present recent work on developing novel SGD-type algorithms for AUC maximization. The new algorithms allow general loss functions and penalty terms, achieved through innovative interactions between machine learning and applied mathematics. In contrast to the previous literature, which requires high storage and per-iteration costs, our algorithms have both space and per-iteration costs of a single datum while achieving optimal convergence rates.
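To illustrate the kind of one-datum-per-iteration algorithm the abstract describes, the sketch below implements SGD on a saddle-point reformulation of the pairwise squared loss for AUC (in the spirit of stochastic online AUC maximization). This is a minimal illustrative sketch, not the authors' exact algorithm: the function name `auc_sgd_sketch`, the step size, and the online estimate of the positive-class frequency are assumptions made for the example. Its space and per-iteration cost are O(d), i.e., one datum, rather than the O(n^2) pairs a naive pairwise loss would touch.

```python
import numpy as np

def auc_sgd_sketch(stream, d, eta=0.05):
    """SGD-type AUC maximization sketch with one-datum space/iteration cost.

    Runs primal descent / dual ascent on a saddle-point reformulation of
    the pairwise squared AUC loss. Illustrative only; hyperparameters and
    the online class-frequency estimate are assumptions of this sketch.
    """
    w = np.zeros(d)        # linear model weights
    a = b = alpha = 0.0    # auxiliary primal scalars and dual variable
    p, t = 0.0, 0          # running estimate of P(y = +1), step counter
    for x, y in stream:
        t += 1
        p += (float(y == 1) - p) / t   # online positive-class frequency
        s = w @ x                       # current score of this datum
        if y == 1:
            # stochastic gradients of the saddle objective at a positive
            gw = 2 * (1 - p) * (s - a) * x - 2 * (1 + alpha) * (1 - p) * x
            ga = -2 * (1 - p) * (s - a)
            gb = 0.0
            gal = -2 * (1 - p) * s - 2 * p * (1 - p) * alpha
        else:
            # stochastic gradients at a negative
            gw = 2 * p * (s - b) * x + 2 * (1 + alpha) * p * x
            ga = 0.0
            gb = -2 * p * (s - b)
            gal = 2 * p * s - 2 * p * (1 - p) * alpha
        # descent in the primal (w, a, b), ascent in the dual (alpha)
        w -= eta * gw
        a -= eta * ga
        b -= eta * gb
        alpha += eta * gal
    return w
```

A usage sketch on synthetic streaming data: each example is seen once, and the learned scores can be evaluated with the empirical AUC (fraction of positive-negative pairs ranked correctly).

```python
rng = np.random.default_rng(0)
n = 2000
X = rng.normal(size=(n, 2))
y = np.where(X[:, 0] + 0.5 * X[:, 1] + 0.3 * rng.normal(size=n) > 0, 1, -1)
w = auc_sgd_sketch(zip(X, y), d=2)
scores = X @ w
auc = np.mean(scores[y == 1][:, None] > scores[y == -1][None, :])
```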