Spotlight
On a Combination of Alternating Minimization and Nesterov's Momentum
Sergey Guminov · Pavel Dvurechenskii · Nazarii Tupitsa · Alexander Gasnikov
Alternating minimization (AM) procedures are practically efficient in many applications for solving convex and non-convex optimization problems. On the other hand, Nesterov's accelerated gradient method is a theoretically optimal first-order method for convex optimization. In this paper we combine AM and Nesterov's acceleration to propose an accelerated alternating minimization algorithm. We prove a $1/k^2$ convergence rate in terms of the objective for convex problems and a $1/k$ rate in terms of the squared gradient norm for non-convex problems, where $k$ is the iteration counter. Our method requires no knowledge of either the convexity of the problem or function parameters such as the Lipschitz constant of the gradient, i.e., it is adaptive to convexity and smoothness and is uniformly optimal for smooth convex and non-convex problems. Further, we develop a primal-dual modification for strongly convex problems with linear constraints and prove the same $1/k^2$ rate for the primal objective residual and constraint feasibility.
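As an informal illustration of the idea described above (and not the authors' exact algorithm, which chooses its extrapolation point by line search and is parameter-free), the sketch below combines one alternating-minimization sweep per iteration with a Nesterov-type extrapolation on a simple two-block least-squares problem. The data `A1`, `A2`, `b`, the block sizes, and the fixed `beta = k/(k+3)` momentum schedule are assumptions made only for this self-contained demo.

```python
# Minimal sketch: block alternating minimization combined with a Nesterov-style
# extrapolation step, on f(x, y) = 0.5 * ||A1 x + A2 y - b||^2.
# NOTE: this is an illustrative assumption-laden demo, not the paper's Algorithm 1.
import numpy as np

rng = np.random.default_rng(0)
A1 = rng.standard_normal((50, 10))
A2 = rng.standard_normal((50, 10))
b = rng.standard_normal(50)

def f(x, y):
    r = A1 @ x + A2 @ y - b
    return 0.5 * r @ r

# Reference value: joint minimizer from a single least-squares solve on [A1 A2].
z_star, *_ = np.linalg.lstsq(np.hstack([A1, A2]), b, rcond=None)
f_star = f(z_star[:10], z_star[10:])

x = np.zeros(10)
y = np.zeros(10)
y_prev = y.copy()

for k in range(100):
    # Nesterov-style extrapolation of the y block (the block held fixed in the
    # next exact x-minimization). The beta = k/(k+3) schedule is a standard
    # momentum choice used here as an assumption, not the paper's adaptive rule.
    beta = k / (k + 3)
    ye = y + beta * (y - y_prev)
    y_prev = y.copy()
    # One AM sweep: exact minimization over x with the extrapolated y fixed,
    # then exact minimization over y with the new x fixed.
    x, *_ = np.linalg.lstsq(A1, b - A2 @ ye, rcond=None)
    y, *_ = np.linalg.lstsq(A2, b - A1 @ x, rcond=None)

print(f"objective gap after 100 sweeps: {f(x, y) - f_star:.3e}")
```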
Author Information
Sergey Guminov (Higher School of Economics)
Pavel Dvurechenskii (Weierstrass Institute)
Graduated with honors from the Moscow Institute of Physics and Technology; PhD on differential games from the same university. Currently a research associate in Berlin working on optimization under inexact information. Research interests include:
- algorithms for convex and non-convex large-scale optimization problems;
- optimization under deterministic and stochastic inexact information;
- randomized algorithms: random coordinate descent, random (derivative-free) directional search;
- numerical aspects of optimal transport;
- algorithms for saddle-point problems and variational inequalities.
Nazarii Tupitsa (Institute for Information Transmission Problems)
Alexander Gasnikov (Moscow Institute of Physics and Technology)
Related Events (a corresponding poster, oral, or spotlight)
- 2021 Poster: On a Combination of Alternating Minimization and Nesterov's Momentum »
  Wed. Jul 21st 04:00 -- 06:00 AM, Room: Virtual
More from the Same Authors
- 2023: Kernel Mirror Prox and RKHS Gradient Flow for Mixed Functional Nash Equilibrium »
  Pavel Dvurechenskii · Jia-Jie Zhu
- 2023 Poster: High-Probability Bounds for Stochastic Optimization and Variational Inequalities: the Case of Unbounded Variance »
  Abdurakhmon Sadiev · Marina Danilova · Eduard Gorbunov · Samuel Horváth · Gauthier Gidel · Pavel Dvurechenskii · Alexander Gasnikov · Peter Richtarik
- 2023 Poster: Is Consensus Acceleration Possible in Decentralized Optimization over Slowly Time-Varying Networks? »
  Dmitry Metelev · Alexander Rogozin · Dmitry Kovalev · Alexander Gasnikov
- 2022 Poster: The power of first-order smooth optimization for black-box non-smooth problems »
  Alexander Gasnikov · Anton Novitskii · Vasilii Novitskii · Farshed Abdukhakimov · Dmitry Kamzolov · Aleksandr Beznosikov · Martin Takac · Pavel Dvurechenskii · Bin Gu
- 2022 Spotlight: The power of first-order smooth optimization for black-box non-smooth problems »
  Alexander Gasnikov · Anton Novitskii · Vasilii Novitskii · Farshed Abdukhakimov · Dmitry Kamzolov · Aleksandr Beznosikov · Martin Takac · Pavel Dvurechenskii · Bin Gu
- 2021 Poster: ADOM: Accelerated Decentralized Optimization Method for Time-Varying Networks »
  Dmitry Kovalev · Egor Shulgin · Peter Richtarik · Alexander Rogozin · Alexander Gasnikov
- 2021 Spotlight: ADOM: Accelerated Decentralized Optimization Method for Time-Varying Networks »
  Dmitry Kovalev · Egor Shulgin · Peter Richtarik · Alexander Rogozin · Alexander Gasnikov
- 2021 Poster: Newton Method over Networks is Fast up to the Statistical Precision »
  Amir Daneshmand · Gesualdo Scutari · Pavel Dvurechenskii · Alexander Gasnikov
- 2021 Spotlight: Newton Method over Networks is Fast up to the Statistical Precision »
  Amir Daneshmand · Gesualdo Scutari · Pavel Dvurechenskii · Alexander Gasnikov
- 2020 Poster: Self-Concordant Analysis of Frank-Wolfe Algorithms »
  Pavel Dvurechenskii · Petr Ostroukhov · Kamil Safin · Shimrit Shtern · Mathias Staudigl
- 2019 Poster: On the Complexity of Approximating Wasserstein Barycenters »
  Alexey Kroshnin · Nazarii Tupitsa · Darina Dvinskikh · Pavel Dvurechenskii · Alexander Gasnikov · Cesar Uribe
- 2019 Oral: On the Complexity of Approximating Wasserstein Barycenters »
  Alexey Kroshnin · Nazarii Tupitsa · Darina Dvinskikh · Pavel Dvurechenskii · Alexander Gasnikov · Cesar Uribe
- 2018 Poster: Computational Optimal Transport: Complexity by Accelerated Gradient Descent Is Better Than by Sinkhorn's Algorithm »
  Pavel Dvurechenskii · Alexander Gasnikov · Alexey Kroshnin
- 2018 Oral: Computational Optimal Transport: Complexity by Accelerated Gradient Descent Is Better Than by Sinkhorn's Algorithm »
  Pavel Dvurechenskii · Alexander Gasnikov · Alexey Kroshnin