
 
Poster
Spectral Frank-Wolfe Algorithm: Strict Complementarity and Linear Convergence
Lijun Ding · Yingjie Fei · Qiantong Xu · Chengrun Yang

Thu Jul 16 07:00 AM -- 07:45 AM & Thu Jul 16 06:00 PM -- 06:45 PM (PDT) @ Virtual

We develop a novel variant of the classical Frank-Wolfe algorithm, which we call spectral Frank-Wolfe, for convex optimization over a spectrahedron. The spectral Frank-Wolfe algorithm has a novel ingredient: in each iteration it computes a few eigenvectors of the gradient and solves a small-scale subproblem. This procedure overcomes the slow convergence of the classical Frank-Wolfe algorithm, which arises from ignoring eigenvalue coalescence. We demonstrate that strict complementarity of the optimization problem is key to proving linear convergence of various algorithms, such as the spectral Frank-Wolfe algorithm as well as the projected gradient method and its accelerated version. We show that strict complementarity is equivalent to the eigengap assumption on the gradient at the optimal solution considered in the literature. As a byproduct of this observation, we also develop a generalized block Frank-Wolfe algorithm and prove its linear convergence.
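The per-iteration structure described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact algorithm: the function names, the projected-gradient inner solver, the rank parameter `r`, and the quadratic objective in the usage example are all assumptions made for illustration. The iterate is restricted to the span of a few eigenvectors of the gradient, and a small r-by-r subproblem over the spectraplex is solved there.

```python
import numpy as np

def proj_spectraplex(S):
    """Project a symmetric matrix onto {S : S >= 0, tr(S) = 1}
    by projecting its eigenvalues onto the probability simplex."""
    w, U = np.linalg.eigh((S + S.T) / 2)
    w_sorted = np.sort(w)[::-1]
    css = np.cumsum(w_sorted) - 1.0
    idx = np.arange(1, len(w) + 1)
    rho = np.max(np.nonzero(w_sorted - css / idx > 0)[0])
    theta = css[rho] / (rho + 1)
    w_proj = np.maximum(w - theta, 0.0)
    return U @ np.diag(w_proj) @ U.T

def spectral_fw(grad_f, n, r=2, iters=100, inner=50, eta=0.1):
    """Illustrative spectral Frank-Wolfe-style sketch (assumed names/solver):
    minimize a smooth convex f over {X : X >= 0, tr(X) = 1}."""
    X = np.eye(n) / n  # feasible starting point on the spectrahedron
    for _ in range(iters):
        G = grad_f(X)
        # "a few eigenvectors of the gradient": the r most-negative directions
        _, U = np.linalg.eigh(G)
        V = U[:, :r]  # n x r, orthonormal columns
        # small-scale subproblem over the r x r spectraplex,
        # solved here by projected gradient (an assumed inner solver)
        S = proj_spectraplex(V.T @ X @ V)  # warm start
        for _ in range(inner):
            Gs = V.T @ grad_f(V @ S @ V.T) @ V
            S = proj_spectraplex(S - eta * Gs)
        X = V @ S @ V.T  # feasible: PSD with unit trace
    return X
```

Because `V` has orthonormal columns and `S` lies on the r-by-r spectraplex, the update `V @ S @ V.T` is automatically feasible (PSD, unit trace), so no extra projection of the full n-by-n iterate is needed.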

Author Information

Lijun Ding (Cornell University)
Yingjie Fei (Cornell University)
Qiantong Xu (Facebook AI Research)
Chengrun Yang (Cornell University)

I am a fourth-year Ph.D. student at the School of Electrical and Computer Engineering at Cornell University. I am advised by Prof. Madeleine Udell, and also have Prof. Thorsten Joachims and Prof. Kilian Weinberger on my committee. My research interest lies broadly in optimization methods for machine learning, with a focus on automated machine learning (AutoML) under resource constraints. I received a B.S. degree in physics from Fudan University in 2016.
