Poster

Delving into the Convergence of Generalized Smooth Minimax Optimization

Wenhan Xian · Ziyi Chen · Heng Huang

Hall C 4-9 #1208
[ Paper PDF ]
Tue 23 Jul 2:30 a.m. PDT — 4 a.m. PDT

Abstract:

Minimax optimization is fundamental to numerous machine learning applications, such as generative adversarial networks, adversarial training, and robust optimization. Recently, a variety of minimax algorithms with theoretical guarantees based on Lipschitz smoothness have been proposed. However, these algorithms can fail to converge in practice because the requisite Lipschitz smoothness condition may not hold, even in some classic minimax problems; we present counterexamples that reveal this divergence issue. Motivated by this gap, we delve into the convergence analysis of minimax algorithms under a relaxed Lipschitz smoothness condition, i.e., generalized smoothness. We prove that variants of the basic minimax optimization algorithms GDA, SGDA, GDmax, and SGDmax still converge on generalized smooth problems, so their theoretical guarantees extend to a wider range of applications. We also conduct a numerical experiment to validate the performance of our proposed algorithms.
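The abstract does not spell out the relaxed condition; a standard formalization in this line of work, assumed here for illustration, is the (L_0, L_1)-smoothness condition of Zhang et al. (2020), under which the local smoothness constant may grow with the gradient norm rather than being uniformly bounded:

```latex
% (L_0, L_1)-smoothness: a common relaxation of Lipschitz smoothness
% (an assumption here; the paper may use a different generalization).
% For all $x, y$ with $\|x - y\| \le 1 / L_1$,
\[
  \|\nabla f(x) - \nabla f(y)\|
  \le \bigl( L_0 + L_1 \|\nabla f(x)\| \bigr) \, \|x - y\| .
\]
% Setting $L_1 = 0$ recovers the usual $L_0$-Lipschitz smoothness.
```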
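As a concrete reference point for the base method the paper builds on, below is a minimal sketch of deterministic GDA on a toy strongly-convex-strongly-concave objective. The objective, step sizes, and iteration count are illustrative assumptions, not the paper's proposed variants or its experiment.

```python
# Minimal sketch of (deterministic) gradient descent ascent (GDA) on
#   f(x, y) = 0.5 * x^2 + 2 * x * y - 0.5 * y^2,
# minimized over x and maximized over y. This toy objective and the
# step sizes are illustrative choices, not the paper's setup.

def grad_x(x, y):
    return x + 2.0 * y          # partial derivative of f w.r.t. x

def grad_y(x, y):
    return 2.0 * x - y          # partial derivative of f w.r.t. y

x, y = 1.0, 1.0                 # arbitrary initialization
eta_x, eta_y = 0.05, 0.05       # descent / ascent step sizes

for _ in range(2000):
    gx, gy = grad_x(x, y), grad_y(x, y)
    x -= eta_x * gx             # descent step on the min variable
    y += eta_y * gy             # ascent step on the max variable

print(x, y)                     # approaches the saddle point (0, 0)
```

With sufficiently small step sizes, these simultaneous updates contract toward the saddle point of this toy problem; the paper's point is that such guarantees, usually proved under Lipschitz smoothness, can break down on generalized smooth problems without suitable algorithmic modifications.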
