Decentralized Riemannian Gradient Descent on the Stiefel Manifold

Shixiang Chen · Alfredo Garcia · Mingyi Hong · Shahin Shahrampour


Keywords: [ Computer Vision ] [ Distributed and Parallel Optimization ]

Tue 20 Jul 9 a.m. PDT — 11 a.m. PDT
Spotlight presentation: Optimization 6
Thu 22 Jul 8:30 p.m. PDT — 9 p.m. PDT

Abstract: We consider distributed non-convex optimization in which a network of agents aims to minimize a global function over the Stiefel manifold. The global function is a finite sum of smooth local functions; each local function is associated with one agent, and agents communicate with each other over an undirected connected graph. The problem is non-convex because the local functions are possibly non-convex (though smooth) and the Stiefel manifold is a non-convex set. We present a decentralized Riemannian stochastic gradient descent method (DRSGD) with a convergence rate of $\mathcal{O}(1/\sqrt{K})$ to a stationary point. To achieve exact convergence with a constant stepsize, we also propose a decentralized Riemannian gradient tracking algorithm (DRGTA) with a convergence rate of $\mathcal{O}(1/K)$ to a stationary point. We use multi-step consensus to keep the iterates in a local (consensus) region. DRGTA is the first decentralized algorithm with exact convergence for distributed optimization on the Stiefel manifold.
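The basic building block of such methods is a single Riemannian gradient step on the Stiefel manifold: project the Euclidean gradient onto the tangent space, take a step, and retract back onto the manifold. The sketch below is illustrative only, not the authors' algorithm; it uses the standard tangent-space projection for the embedded metric and a QR-based retraction, on a hypothetical Rayleigh-quotient objective (its maximizer spans a leading eigenspace).

```python
import numpy as np

def proj_tangent(X, G):
    """Project a Euclidean gradient G onto the tangent space of the
    Stiefel manifold St(n, p) = {X : X^T X = I} at the point X."""
    XtG = X.T @ G
    return G - X @ (XtG + XtG.T) / 2

def retract_qr(Y):
    """QR-based retraction: map an arbitrary full-rank matrix Y back
    onto the Stiefel manifold via its thin QR factor."""
    Q, R = np.linalg.qr(Y)
    # Fix column signs so the retraction is uniquely defined.
    d = np.sign(np.sign(np.diag(R)) + 0.5)
    return Q * d

# Illustrative objective (assumption, not from the paper):
# f(X) = -0.5 * trace(X^T A X), whose minimizers span a leading
# eigenspace of the symmetric matrix A.
rng = np.random.default_rng(0)
n, p = 8, 2
A = rng.standard_normal((n, n))
A = A + A.T                      # make A symmetric
X = retract_qr(rng.standard_normal((n, p)))  # random feasible start

step = 0.1
G = -A @ X                       # Euclidean gradient of f at X
xi = proj_tangent(X, G)          # Riemannian gradient
X_new = retract_qr(X - step * xi)  # retraction keeps X_new^T X_new = I
```

In a decentralized variant, each agent would hold its own copy of X, average neighbors' copies using a doubly stochastic mixing matrix, and retract the result back onto the manifold before (or after) the local gradient step; the multi-step consensus in the abstract repeats that averaging several times per iteration to keep all copies in a common local region.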
