Block-coordinate descent algorithms and alternating minimization methods are fundamental primitives in large-scale optimization and machine learning. While block-coordinate-descent-type methods have been studied extensively, only alternating minimization -- which applies to the setting of exactly two blocks -- is known to have convergence time that scales independently of the least smooth block. A natural question is then: is the setting of two blocks special? We show that the answer is "no," as long as the least smooth block can be optimized exactly -- an assumption that is also needed in the setting of alternating minimization. We do so by introducing a novel algorithm, AR-BCD, whose convergence time scales independently of the least smooth (possibly non-smooth) block. The basic algorithm generalizes both alternating minimization and randomized block coordinate (gradient) descent, and we also provide its accelerated version, AAR-BCD.
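To make the setting concrete, the following is a minimal sketch of two-block alternating minimization on a convex quadratic, where each step minimizes the objective exactly over one block with the other held fixed -- the "exact block optimization" assumption the abstract refers to. This is a generic illustration of the setting only, not the paper's AR-BCD or AAR-BCD algorithms; the problem instance, block split, and iteration count are arbitrary choices for the example.

```python
import numpy as np

# Minimize f(x) = 0.5 * x^T A x - b^T x over x in R^n by alternating
# exact minimization over two coordinate blocks (a toy instance; not AR-BCD).
rng = np.random.default_rng(0)
n = 6
M = rng.standard_normal((n, n))
A = M @ M.T + np.eye(n)            # symmetric positive definite
b = rng.standard_normal(n)
x_star = np.linalg.solve(A, b)     # global minimizer, for comparison

B1, B2 = np.arange(0, 3), np.arange(3, 6)
x = np.zeros(n)
for _ in range(500):
    # Exact minimization over block 1: solve A11 x1 = b1 - A12 x2.
    x[B1] = np.linalg.solve(A[np.ix_(B1, B1)], b[B1] - A[np.ix_(B1, B2)] @ x[B2])
    # Exact minimization over block 2: solve A22 x2 = b2 - A21 x1.
    x[B2] = np.linalg.solve(A[np.ix_(B2, B2)], b[B2] - A[np.ix_(B2, B1)] @ x[B1])

err = np.linalg.norm(x - x_star)   # distance to the true minimizer
```

Because each block subproblem is solved exactly, the per-step cost and the convergence rate do not involve the smoothness constant of the block being minimized, which is the property the paper extends beyond the two-block case.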
Author Information
Jelena Diakonikolas (Boston University)
Jelena Diakonikolas is a Postdoctoral Associate at Boston University. She completed her Ph.D. degree in electrical engineering at Columbia University. Her research interests include large-scale optimization with a focus on first-order methods and their applications in engineered and networked systems. She is a recipient of a Simons-Berkeley Research Fellowship (2018), the Morton B. Friedman Prize for Excellence at Columbia Engineering (2017), and a Qualcomm Innovation Fellowship (2015). In 2016, she was featured on the inaugural N^2 Women list of “10 Women in Networking/Communications That You Should Watch”.
Lorenzo Orecchia (Boston University)
Related Events (a corresponding poster, oral, or spotlight)
- 2018 Oral: Alternating Randomized Block Coordinate Descent
  Thu. Jul 12th 12:30 -- 12:50 PM, Room A9
More from the Same Authors
- 2018 Poster: On Acceleration with Noise-Corrupted Gradients
  Michael Cohen · Jelena Diakonikolas · Lorenzo Orecchia
- 2018 Oral: On Acceleration with Noise-Corrupted Gradients
  Michael Cohen · Jelena Diakonikolas · Lorenzo Orecchia
- 2017 Poster: Connected Subgraph Detection with Mirror Descent on SDPs
  Cem Aksoylar · Lorenzo Orecchia · Venkatesh Saligrama
- 2017 Talk: Connected Subgraph Detection with Mirror Descent on SDPs
  Cem Aksoylar · Lorenzo Orecchia · Venkatesh Saligrama