Federated Minimax Optimization: Improved Convergence Analyses and Algorithms
PRANAY SHARMA · Rohan Panda · Gauri Joshi · Pramod K Varshney

Tue Jul 19 03:30 PM -- 05:30 PM (PDT) @ Hall E #605

In this paper, we consider nonconvex minimax optimization, which is gaining prominence in many modern machine learning applications, such as GANs. Large-scale, edge-based collection of training data in these applications calls for communication-efficient distributed optimization algorithms, such as those used in federated learning, to process the data. We analyze Local stochastic gradient descent ascent (SGDA), the local-update version of the SGDA algorithm. SGDA is the core algorithm for minimax optimization, but it is not well understood in a distributed setting. We prove that Local SGDA has order-optimal sample complexity for several classes of nonconvex-concave and nonconvex-nonconcave minimax problems, and also enjoys linear speedup with respect to the number of clients. We provide a novel and tighter analysis, which improves on the convergence and communication guarantees in the existing literature. For nonconvex-PL and nonconvex-one-point-concave functions, we improve the existing complexity results for centralized minimax problems. Furthermore, we propose a momentum-based local-update algorithm that has the same convergence guarantees but outperforms Local SGDA, as demonstrated in our experiments.
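The local-update scheme described in the abstract can be illustrated with a minimal sketch. This is not the authors' exact algorithm or experimental setup; it is a hedged toy implementation of periodic-averaging SGDA, with hypothetical parameter names, run on a simple saddle-point objective f(x, y) = 0.5x² + xy − 0.5y² (saddle at (0, 0)) with additive gradient noise standing in for stochastic gradients:

```python
import numpy as np

def local_sgda(K=8, tau=10, rounds=200, lr=0.05, noise=0.1, seed=0):
    """Toy Local SGDA: K clients each run tau local stochastic gradient
    descent-ascent steps per round, then a server averages their iterates.
    Objective: f(x, y) = 0.5*x**2 + x*y - 0.5*y**2, saddle point at (0, 0)."""
    rng = np.random.default_rng(seed)
    x, y = 1.0, -1.0                      # server iterates
    for _ in range(rounds):
        xs = np.full(K, x)                # server broadcasts current model
        ys = np.full(K, y)
        for _ in range(tau):              # tau local SGDA steps per client
            gx = xs + ys + noise * rng.standard_normal(K)  # noisy grad w.r.t. x
            gy = xs - ys + noise * rng.standard_normal(K)  # noisy grad w.r.t. y
            xs -= lr * gx                 # descent step on the min variable
            ys += lr * gy                 # ascent step on the max variable
        x, y = xs.mean(), ys.mean()       # server averages client iterates
    return x, y

x, y = local_sgda()
print(x, y)
```

Communication happens only once per round (after tau local steps) rather than after every gradient step, which is the source of the communication savings; averaging over K clients also reduces gradient-noise variance, reflecting the linear-speedup behavior the paper analyzes.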

Author Information


Pranay Sharma (Carnegie Mellon University)

I am a postdoctoral researcher in the Dept. of Electrical and Computer Engineering at Carnegie Mellon University, working with Prof. Gauri Joshi. In August 2021, I finished my Ph.D. in Electrical Engineering and Computer Science at Syracuse University, advised by Prof. Pramod K. Varshney. I completed my B.Tech-M.Tech dual degree in Electrical Engineering at IIT Kanpur.

Rohan Panda (Carnegie Mellon University)
Gauri Joshi (Carnegie Mellon University)
Pramod K Varshney (Syracuse University)
