Poster
Alternating Randomized Block Coordinate Descent
Jelena Diakonikolas · Lorenzo Orecchia

Thu Jul 12 09:15 AM -- 12:00 PM (PDT) @ Hall B #67

Block-coordinate descent algorithms and alternating minimization methods are fundamental optimization algorithms and important primitives in large-scale optimization and machine learning. While various block-coordinate-descent-type methods have been studied extensively, only alternating minimization, which applies to the setting of two blocks, is known to have convergence time that scales independently of the least smooth block. A natural question is then: is the setting of two blocks special? We show that the answer is "no" as long as the least smooth block can be optimized exactly, an assumption that is also needed in the setting of alternating minimization. We do so by introducing a novel algorithm, AR-BCD, whose convergence time scales independently of the least smooth (possibly non-smooth) block. The basic algorithm generalizes both alternating minimization and randomized block coordinate (gradient) descent, and we also provide its accelerated version, AAR-BCD.
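To make the update pattern concrete, here is a minimal sketch in Python/NumPy. It is not the paper's exact AR-BCD procedure (whose block sampling, step sizes, and accelerated variant are specified in the paper); the three-block least-squares objective, the matrices A, B, C, b, the block sizes, and the uniform choice between the two smooth blocks are all illustrative assumptions. Blocks x and y receive randomized block-coordinate gradient steps, while block z stands in for the least smooth block and is minimized exactly.

import numpy as np

# Toy objective with three blocks: minimize 0.5 * ||A x + B y + C z - b||^2.
# Blocks x and y play the role of the smooth blocks (randomized gradient steps);
# block z stands in for the least smooth block and is minimized exactly.
rng = np.random.default_rng(0)
m, nx, ny, nz = 50, 10, 10, 10
A = rng.standard_normal((m, nx))
B = rng.standard_normal((m, ny))
C = rng.standard_normal((m, nz))
b = rng.standard_normal(m)

x, y, z = np.zeros(nx), np.zeros(ny), np.zeros(nz)

def residual(x, y, z):
    return A @ x + B @ y + C @ z - b

# Per-block smoothness constants: largest eigenvalues of A^T A and B^T B.
Lx = np.linalg.eigvalsh(A.T @ A).max()
Ly = np.linalg.eigvalsh(B.T @ B).max()

for _ in range(500):
    # Randomized block-coordinate gradient step on one of the smooth blocks.
    if rng.random() < 0.5:
        x -= (A.T @ residual(x, y, z)) / Lx
    else:
        y -= (B.T @ residual(x, y, z)) / Ly
    # Exact minimization over the remaining block (the alternating step),
    # here a closed-form least-squares solve.
    z, *_ = np.linalg.lstsq(C, b - A @ x - B @ y, rcond=None)

print("final objective:", 0.5 * np.linalg.norm(residual(x, y, z)) ** 2)

In this sketch the progress of the gradient steps depends only on the smoothness constants of the x and y blocks; the z block is handled entirely by the exact solve, which mirrors the independence from the least smooth block described in the abstract.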

Author Information

Jelena Diakonikolas (Boston University)

Jelena Diakonikolas is a Postdoctoral Associate at Boston University. She completed her Ph.D. degree in electrical engineering at Columbia University. Her research interests include large-scale optimization with a focus on first-order methods and their applications in engineered and networked systems. She is a recipient of a Simons-Berkeley Research Fellowship (2018), the Morton B. Friedman Prize for Excellence at Columbia Engineering (2017), and a Qualcomm Innovation Fellowship (2015). In 2016, she was featured on the inaugural N^2 Women list of “10 Women in Networking/Communications That You Should Watch”.

Lorenzo Orecchia (Boston)
