
Poster

SILVER: Single-loop variance reduction and application to federated learning

Kazusato Oko · Shunta Akiyama · Denny Wu · Tomoya Murata · Taiji Suzuki

Hall C 4-9 #2803
[ Paper PDF ]
Wed 24 Jul 2:30 a.m. PDT — 4 a.m. PDT

Abstract:

Most variance reduction methods require computing the full gradient multiple times, which is time-consuming and hence a bottleneck when applied to distributed optimization. We present a single-loop variance-reduced gradient estimator named SILVER (SIngle-Loop VariancE-Reduction) for finite-sum non-convex optimization, which does not require multiple full gradients but nevertheless achieves the optimal gradient complexity. Notably, unlike existing methods, SILVER provably reaches second-order optimality, with exponential convergence in the Polyak-Ɓojasiewicz (PL) region, and achieves further speedup depending on the data heterogeneity. Owing to these advantages, SILVER serves as a new base method for designing communication-efficient federated learning algorithms: we combine SILVER with local updates, which yields the best communication rounds and number of communicated gradients across the entire range of Hessian heterogeneity while guaranteeing second-order optimality and exponential convergence in the PL region.
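The abstract does not spell out SILVER's update rule, but it may help to see what "single-loop variance reduction" looks like in general. Below is a minimal Python sketch of one well-known single-loop estimator, the STORM-style recursive momentum update, which avoids periodic full-gradient passes by correcting a running gradient estimate with one fresh sample per step. Everything here (`grad_fn`, the momentum parameter `a`, the step size) is an illustrative assumption, not the authors' method; SILVER's actual estimator and guarantees are given in the paper.

```python
import numpy as np

def single_loop_vr_sgd(grad_fn, x0, n_samples, steps, lr=0.01, a=0.1, rng=None):
    """Minimize a finite-sum objective (1/n) * sum_i f_i(x) with a
    single-loop variance-reduced estimator (STORM-style recursive momentum).
    Illustrative sketch only; SILVER's estimator differs in its details.

    grad_fn(x, i) must return the gradient of the i-th component f_i at x.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = x0.copy()
    i = rng.integers(n_samples)
    d = grad_fn(x, i)                 # initialize estimator with one stochastic gradient
    for _ in range(steps):
        x_prev, x = x, x - lr * d     # gradient step using the running estimator d
        i = rng.integers(n_samples)   # one fresh sample; no full-gradient pass, no outer loop
        # Recursive correction: reuse the same sample i at x and x_prev so the
        # error in d contracts over time instead of accumulating.
        d = grad_fn(x, i) + (1 - a) * (d - grad_fn(x_prev, i))
    return x

# Toy usage: average of shifted quadratics, whose minimizer is the mean shift.
if __name__ == "__main__":
    shifts = np.linspace(-1.0, 1.0, 50)
    grad = lambda x, i: x - shifts[i]
    x_star = single_loop_vr_sgd(grad, np.array([5.0]), n_samples=50, steps=5000)
    print(x_star)  # close to shifts.mean() == 0
```

The key design point, and the one the abstract emphasizes, is that the estimator is maintained with a single recursion rather than a nested loop anchored at occasional full gradients, which is what makes this family attractive for distributed and federated settings where full-gradient synchronization is expensive.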
