Poster
ADOM: Accelerated Decentralized Optimization Method for Time-Varying Networks
Dmitry Kovalev · Egor Shulgin · Peter Richtarik · Alexander Rogozin · Alexander Gasnikov

Thu Jul 22 09:00 PM -- 11:00 PM (PDT) @ Virtual

We propose ADOM -- an accelerated method for smooth and strongly convex decentralized optimization over time-varying networks. ADOM uses a dual oracle, i.e., we assume access to the gradient of the Fenchel conjugate of the individual loss functions. Up to a constant factor, which depends on the network structure only, its communication complexity matches that of Nesterov's accelerated gradient method. To the best of our knowledge, only the algorithm of Rogozin et al. (2019) has a convergence rate with similar properties. However, their algorithm converges under the very restrictive assumption that the number of network changes cannot exceed a tiny percentage of the number of iterations. This assumption is hard to satisfy in practice, as changes in network topology usually cannot be controlled. In contrast, ADOM merely requires the network to stay connected throughout time.
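To make the dual-oracle assumption concrete, here is a minimal sketch (not from the paper) of what "access to the gradient of the Fenchel conjugate" means for a strongly convex quadratic loss, where that gradient has a closed form. The quadratic f, the matrix A, the vector b, and the helper name `conjugate_gradient_oracle` are illustrative assumptions, not objects defined in the paper.

```python
import numpy as np

# Sketch of a dual oracle for f(x) = 0.5 * x^T A x - b^T x with A positive
# definite (so f is strongly convex). For this f, the gradient of the
# Fenchel conjugate f*(y) = sup_x { <y, x> - f(x) } is the maximizer itself:
#   grad f*(y) = argmax_x { <y, x> - f(x) } = A^{-1} (y + b).
# The quadratic form and all names here are illustrative assumptions.

def conjugate_gradient_oracle(A, b, y):
    """Return grad f*(y) for f(x) = 0.5 x^T A x - b^T x, A positive definite."""
    return np.linalg.solve(A, y + b)

rng = np.random.default_rng(0)
n = 5
M = rng.standard_normal((n, n))
A = M @ M.T + np.eye(n)   # positive definite, hence f is strongly convex
b = rng.standard_normal(n)
y = rng.standard_normal(n)

x_star = conjugate_gradient_oracle(A, b, y)
# Sanity check: at the maximizer, grad f(x*) = y, i.e. A x* - b = y.
assert np.allclose(A @ x_star - b, y)
```

For a general smooth, strongly convex loss the maximizer has no closed form, and evaluating the oracle amounts to solving this inner maximization problem.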

Author Information

Dmitry Kovalev (KAUST)
Egor Shulgin (KAUST and MIPT)

I am a PhD student in Computer Science at King Abdullah University of Science and Technology (KAUST), advised by [Peter Richtarik](https://richtarik.org/). Prior to that, I obtained a BSc in Applied Mathematics, Computer Science, and Physics from the Moscow Institute of Physics and Technology in 2019.

Peter Richtarik (KAUST)

Peter Richtarik is an Associate Professor of Computer Science and Mathematics at KAUST and an Associate Professor of Mathematics at the University of Edinburgh. He is an EPSRC Fellow in Mathematical Sciences, a Fellow of the Alan Turing Institute, and is affiliated with the Visual Computing Center and the Extreme Computing Research Center at KAUST. Dr. Richtarik received his PhD from Cornell University in 2007, and then worked as a Postdoctoral Fellow in Louvain, Belgium, before joining Edinburgh in 2009 and KAUST in 2017. Dr. Richtarik's research interests lie at the intersection of mathematics, computer science, machine learning, optimization, numerical linear algebra, high performance computing, and applied probability. Through his recent work on randomized decomposition algorithms (such as randomized coordinate descent methods, stochastic gradient descent methods, and their numerous extensions, improvements, and variants), he has contributed to the foundations of the emerging field of big data optimization, randomized numerical linear algebra, and stochastic methods for empirical risk minimization. Several of his papers have attracted international awards, including the SIAM SIGEST Best Paper Award, the IMA Leslie Fox Prize (2nd prize, twice), and the INFORMS Computing Society Best Student Paper Award (sole runner-up). He is the founder and organizer of the Optimization and Big Data workshop series.

Alexander Rogozin (Moscow Institute of Physics and Technology)
Alexander Gasnikov (Moscow Institute of Physics and Technology)
