

Poster

Decentralized Convex Finite-Sum Optimization with Better Dependence on Condition Numbers

Yuxing Liu · Lesi Chen · Luo Luo


Abstract:

This paper studies the decentralized optimization problem in which the local objective on each node is the average of a finite set of convex functions and the global objective is strongly convex. We propose an efficient stochastic variance-reduced first-order method that allows different nodes to construct their local stochastic gradient estimators with different mini-batch sizes at each iteration. We prove that the upper bound on the computation time of the proposed method depends on the global condition number, which is sharper than previous results that depend only on the local condition numbers. Compared with state-of-the-art methods, our method also requires fewer local incremental first-order oracle calls while incurring comparable communication cost. We further present numerical experiments that validate the advantage of our method.
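For concreteness, the setting described in the abstract can be written in a standard form as follows; the notation here ($m$ nodes, $n_i$ local components, smoothness constants $L$, $L_i$, strong convexity $\mu$) is an assumption for illustration and is not taken from the paper:

\min_{x \in \mathbb{R}^d} \; f(x) \;=\; \frac{1}{m} \sum_{i=1}^{m} f_i(x),
\qquad
f_i(x) \;=\; \frac{1}{n_i} \sum_{j=1}^{n_i} f_{i,j}(x),

where node $i$ holds the convex components $f_{i,j}$, each $f_i$ is $L_i$-smooth, and the global objective $f$ is $L$-smooth and $\mu$-strongly convex. Since $L \le \max_i L_i$, the global condition number $\kappa = L/\mu$ is never larger than the local condition numbers $\kappa_i = L_i/\mu$, which is why a computation-time bound depending on $\kappa$ can be sharper than one depending only on the $\kappa_i$.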
