

Spectral Frank-Wolfe Algorithm: Strict Complementarity and Linear Convergence

Lijun Ding · Yingjie Fei · Qiantong Xu · Chengrun Yang


Keywords: [ Optimization - Convex ] [ Large Scale Learning and Big Data ] [ Convex Optimization ]


We develop a novel variant of the classical Frank-Wolfe algorithm, which we call spectral Frank-Wolfe, for convex optimization over a spectrahedron. The spectral Frank-Wolfe algorithm has a novel ingredient: in each iteration it computes a few eigenvectors of the gradient and solves a small-scale subproblem over the subspace they span. This procedure overcomes the slow convergence of the classical Frank-Wolfe algorithm, which arises from its ignoring the coalescence of eigenvalues of the gradient at the optimal solution. We demonstrate that strict complementarity of the optimization problem is key to proving linear convergence of several algorithms, including the spectral Frank-Wolfe algorithm as well as the projected gradient method and its accelerated version. We show that strict complementarity is equivalent to the eigengap assumption on the gradient at the optimal solution considered in the literature. As a byproduct of this observation, we also develop a generalized block Frank-Wolfe algorithm and prove its linear convergence.
