Stochastic optimization algorithms such as SGD update the model sequentially with cheap per-iteration costs, making them amenable to large-scale data analysis. However, most existing studies focus on classification accuracy and cannot be directly applied to the important problems of maximizing the area under the ROC curve (AUC) in imbalanced classification and bipartite ranking. In this paper, we develop a novel stochastic proximal algorithm for AUC maximization, referred to as SPAM. In contrast to previous work, SPAM applies to non-smooth penalty functions and achieves a convergence rate of O(log t / t) for strongly convex objectives, while both its space and per-iteration costs are of one datum.
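The abstract's "stochastic proximal" structure can be illustrated with a minimal sketch: a per-datum gradient step on a smooth AUC surrogate, followed by the proximal operator of a non-smooth penalty (here L1 soft-thresholding). This is an assumption-laden illustration, not the paper's exact update rule: the surrogate below is a simplified least-squares loss on running class means (`means`, `counts`, and `spam_like_update` are hypothetical names), whereas the true SPAM gradient is defined in the paper.

```python
import numpy as np

def prox_l1(v, lam):
    """Proximal operator of lam * ||.||_1, i.e. soft-thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def spam_like_update(w, x, y, means, counts, eta, lam):
    """One stochastic proximal step (illustrative sketch, not the paper's
    exact rule): update running class means with the new datum, take a
    gradient step on a least-squares AUC surrogate, then apply the L1 prox.
    Space and per-iteration cost are O(d), i.e. of one datum."""
    # Update the running mean of the observed class (0 = negative, 1 = positive).
    c = 1 if y > 0 else 0
    counts[c] += 1
    means[c] += (x - means[c]) / counts[c]
    # Surrogate gradient: encourage a unit score margin between class means.
    diff = means[1] - means[0]
    grad = -(1.0 - w @ diff) * diff  # gradient of 0.5 * (1 - w^T diff)^2
    # Gradient step on the smooth part, proximal step on the penalty.
    return prox_l1(w - eta * grad, eta * lam)
```

The key point the sketch shares with SPAM is that the non-smooth penalty is handled by its proximal map rather than a subgradient, and no pairs of examples need to be stored.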
Author Information
Michael Natole Jr (University at Albany)
Yiming Ying (SUNY Albany)
Siwei Lyu (University at Albany, State University of New York)
Related Events (a corresponding poster, oral, or spotlight)
- 2018 Oral: Stochastic Proximal Algorithms for AUC Maximization (Thu Jul 12, 02:50-03:00 PM, Room A5)
More from the Same Authors
- 2020 Poster: Fine-Grained Analysis of Stability and Generalization for Stochastic Gradient Descent (Yunwen Lei · Yiming Ying)
- 2019 Poster: Stochastic Iterative Hard Thresholding for Graph-structured Sparsity Optimization (Baojian Zhou · Feng Chen · Yiming Ying)
- 2019 Oral: Stochastic Iterative Hard Thresholding for Graph-structured Sparsity Optimization (Baojian Zhou · Feng Chen · Yiming Ying)