LS$^2$MC-GDA: A Smoothed Algorithm for Federated Stochastic Compositional Minimax Optimization
Xinwen Zhang ⋅ Richard Souvenir ⋅ Hongchang Gao
Abstract
Federated stochastic multi-level compositional minimax optimization supports a growing number of machine learning applications. However, the interplay of the multi-level compositional structure, the minimax formulation, and the federated setting poses significant optimization challenges, resulting in slow convergence rates for existing algorithms. In this paper, we propose a novel federated algorithm, LS$^2$MC-GDA, that leverages smoothing techniques and variance-reduced stochastic compositional gradients. To support our theoretical analysis, we introduce a stage-wise extension of LS$^2$MC-GDA, which serves to bridge the gap between different stationarity measures. We establish that our algorithm achieves a sample complexity of $O(\kappa^{3/2}/N\epsilon^3)$ and a communication complexity of $O(\kappa/\epsilon^2)$, substantially improving existing theoretical results in terms of the condition number $\kappa$ and the solution accuracy $\epsilon$, while achieving a linear speedup with respect to the number of workers $N$. Finally, experimental results validate the effectiveness of our approach.