

Poster

Asynchronous Decentralized Optimization With Implicit Stochastic Variance Reduction

Kenta Niwa · Guoqiang Zhang · W. Bastiaan Kleijn · Noboru Harada · Hiroshi Sawada · Akinori Fujino


Keywords: [ Distributed and Parallel Optimization ]


Abstract:

A novel asynchronous decentralized optimization method based on Stochastic Variance Reduction (SVR) is proposed. Average-consensus algorithms, such as Decentralized Stochastic Gradient Descent (DSGD), facilitate distributed training of machine learning models. However, the local gradients drift because of the statistical heterogeneity of the data subsets residing on the nodes and the long communication intervals. Two approaches address this drift problem: (i) Gradient Tracking-SVR (GT-SVR) integrates SVR into DSGD, and (ii) Edge-Consensus Learning (ECL) solves a model-constrained minimization problem using a primal-dual formalism. In this paper, we reformulate the update procedure of ECL so that it implicitly includes the gradient modification of SVR by optimally selecting a constraint-strength control parameter. Through convergence analysis and experiments, we confirm that the proposed ECL with Implicit SVR (ECL-ISVR) is stable and approximately reaches the reference performance obtained by computation on a single node using the full data set.
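To illustrate the kind of gradient drift the abstract refers to and how an SVR-style correction counteracts it, the sketch below simulates decentralized gradient descent on heterogeneous local problems over a ring topology. This is a minimal, hypothetical illustration, not the authors' ECL-ISVR or GT-SVR implementation; the topology, the least-squares local losses, and the SCAFFOLD/gradient-tracking-like control variates `c[i]` are all assumptions made for the example.

```python
# Minimal sketch: decentralized SGD with an SVR-style correction term that
# removes the local-gradient bias caused by heterogeneous node data.
# All problem data and parameter names here are illustrative, not from the paper.
import numpy as np

rng = np.random.default_rng(0)
n_nodes, dim, lr, rounds, local_steps = 4, 5, 0.05, 200, 5

# Heterogeneous local least-squares losses: f_i(x) = 0.5 * ||A_i x - b_i||^2 / m
A = [rng.normal(i, 1.0, size=(20, dim)) for i in range(n_nodes)]
b = [rng.normal(i, 1.0, size=20) for i in range(n_nodes)]
grad = lambda i, x: A[i].T @ (A[i] @ x - b[i]) / len(b[i])

x = [np.zeros(dim) for _ in range(n_nodes)]   # local models
c = [np.zeros(dim) for _ in range(n_nodes)]   # SVR-style control variates

# Doubly stochastic mixing matrix for a ring: average with the two neighbours.
W = np.zeros((n_nodes, n_nodes))
for i in range(n_nodes):
    W[i, i] = 0.5
    W[i, (i - 1) % n_nodes] = 0.25
    W[i, (i + 1) % n_nodes] = 0.25

for _ in range(rounds):
    # Local updates: the correction c[i] pushes the step toward the average gradient,
    # so long local phases do not drift toward each node's own minimizer.
    for i in range(n_nodes):
        for _ in range(local_steps):
            x[i] = x[i] - lr * (grad(i, x[i]) + c[i])
    # Consensus (mixing) step over the ring.
    x = [sum(W[i, j] * x[j] for j in range(n_nodes)) for i in range(n_nodes)]
    # Refresh the control variates from the current gradients (variance reduction).
    g = [grad(i, x[i]) for i in range(n_nodes)]
    g_avg = sum(g) / n_nodes
    c = [g_avg - g[i] for i in range(n_nodes)]

print("consensus gap:", max(np.linalg.norm(x[i] - x[0]) for i in range(n_nodes)))
```

Setting `c[i] = 0` throughout recovers plain DSGD with local steps, where the heterogeneous losses pull each node toward its own minimizer between communication rounds; the correction term is the role that, per the abstract, ECL-ISVR obtains implicitly through its choice of the constraint-strength control parameter rather than by explicit gradient tracking.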
