Stochastic variance reduction algorithms have recently become popular for minimizing the average of a large but finite number of loss functions on a Riemannian manifold. The present paper proposes a Riemannian stochastic recursive gradient algorithm (R-SRG), which does not require the inverse of a retraction between two distant iterates on the manifold. Convergence analyses of R-SRG are performed for both retraction-convex and non-convex functions under computationally efficient retraction and vector transport operations. The key challenge is analyzing the influence of vector transport along the retraction curve. Numerical evaluations reveal that R-SRG competes well with state-of-the-art Riemannian batch and stochastic gradient algorithms.
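For intuition, here is a minimal sketch of the recursive variance-reduced update the abstract describes, instantiated on the unit sphere with NumPy. The manifold, the quadratic losses, and all names (`retract`, `vector_transport`, `rsrg`, the step size and loop lengths) are illustrative assumptions rather than the authors' reference implementation; the point is the recursive estimate v_t = grad f_i(x_t) - T(grad f_i(x_{t-1})) + T(v_{t-1}), where the vector transport T moves tangent vectors along the retraction curve instead of requiring an inverse retraction between iterates.

```python
import numpy as np

# Illustrative R-SRG-style sketch on the unit sphere S^{d-1}, minimizing
# f(x) = (1/n) * sum_i x^T A_i x. Manifold primitives and hyperparameters
# are assumptions for this toy example, not the paper's exact setup.

def retract(x, v):
    # Projection retraction on the sphere: normalize x + v.
    y = x + v
    return y / np.linalg.norm(y)

def project(x, v):
    # Orthogonal projection of v onto the tangent space at x.
    return v - np.dot(x, v) * x

def vector_transport(x, y, v):
    # Projection-based vector transport of the tangent vector v at x
    # to the tangent space at y.
    return project(y, v)

def rgrad(A, x):
    # Riemannian gradient: tangent projection of the Euclidean gradient 2*A*x.
    return project(x, 2.0 * (A @ x))

def rsrg(As, x0, eta=0.01, outer=20, inner=50, rng=None):
    if rng is None:
        rng = np.random.default_rng(0)
    n, x_tilde = len(As), x0
    for _ in range(outer):
        # Full Riemannian gradient at the outer iterate.
        v = np.mean([rgrad(A, x_tilde) for A in As], axis=0)
        x_prev, x = x_tilde, retract(x_tilde, -eta * v)
        for _ in range(inner):
            i = rng.integers(n)
            # Recursive update: new stochastic gradient minus the transported
            # old one, plus the transported previous direction.
            v = (rgrad(As[i], x)
                 - vector_transport(x_prev, x, rgrad(As[i], x_prev))
                 + vector_transport(x_prev, x, v))
            x_prev, x = x, retract(x, -eta * v)
        x_tilde = x
    return x_tilde

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    d, n = 10, 100
    As = [(lambda B: B + B.T)(rng.standard_normal((d, d))) for _ in range(n)]
    x0 = np.ones(d) / np.sqrt(d)
    x_star = rsrg(As, x0, rng=rng)  # should drift toward a minimal eigenvector of mean(A_i)
```

On the sphere this toy objective is a Rayleigh quotient, so the iterates should approach an eigenvector associated with the smallest eigenvalue of the averaged matrix; only the recursion itself is carried over from the abstract.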
Author Information
Hiroyuki Kasai (The University of Electro-Communications)
Hiroyuki Sato (Kyoto University)
Bamdev Mishra (Microsoft)
Related Events (a corresponding poster, oral, or spotlight)
- 2018 Oral: Riemannian Stochastic Recursive Gradient Algorithm with Retraction and Vector Transport and Its Convergence Analysis
  Thu Jul 12th 03:30 -- 03:40 PM, Room A9
More from the Same Authors
- 2019 Poster: Riemannian adaptive stochastic gradient algorithms on matrix manifolds
  Hiroyuki Kasai · Pratik Kumar Jawanpuria · Bamdev Mishra
- 2019 Oral: Riemannian adaptive stochastic gradient algorithms on matrix manifolds
  Hiroyuki Kasai · Pratik Kumar Jawanpuria · Bamdev Mishra
- 2018 Poster: A Unified Framework for Structured Low-rank Matrix Learning
  Pratik Kumar Jawanpuria · Bamdev Mishra
- 2018 Oral: A Unified Framework for Structured Low-rank Matrix Learning
  Pratik Kumar Jawanpuria · Bamdev Mishra