
Poster

Adaptive Accelerated (Extra-)Gradient Methods with Variance Reduction

Zijian Liu · Ta Duy Nguyen · Alina Ene · Huy Nguyen

Hall E #708

Keywords: [ OPT: Convex ]


Abstract: In this paper, we study the finite-sum convex optimization problem, focusing on the general convex case. Recently, the study of variance reduced (VR) methods and their accelerated variants has made exciting progress. However, the step size used in the existing VR algorithms typically depends on the smoothness parameter, which is often unknown and requires tuning in practice. To address this problem, we propose two novel adaptive VR algorithms: \textit{Adaptive Variance Reduced Accelerated Extra-Gradient} (AdaVRAE) and \textit{Adaptive Variance Reduced Accelerated Gradient} (AdaVRAG). Our algorithms do not require knowledge of the smoothness parameter. AdaVRAE uses $\mathcal{O}\left(n\log\log n+\sqrt{\frac{n\beta}{\epsilon}}\right)$ and AdaVRAG uses $\mathcal{O}\left(n\log\log n+\sqrt{\frac{n\beta\log\beta}{\epsilon}}\right)$ gradient evaluations to attain an $\mathcal{O}(\epsilon)$-suboptimal solution, where $n$ is the number of functions in the finite sum and $\beta$ is the smoothness parameter. This result matches the best-known convergence rate of non-adaptive VR methods and improves upon the convergence of the state-of-the-art adaptive VR method, AdaSVRG. We demonstrate the superior performance of our algorithms compared with previous methods in experiments on real-world datasets.
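To illustrate the two ingredients the abstract combines, variance reduction over a finite sum and a step size that adapts without knowing the smoothness parameter $\beta$, here is a minimal sketch in the spirit of an SVRG-style estimator paired with an AdaGrad-style step size. This is a hypothetical illustration only, not the paper's AdaVRAE or AdaVRAG; the function name `svrg_adaptive`, the parameters `epochs`, `inner_steps`, and `eps`, and the specific adaptive rule are all assumptions made for the example.

```python
import numpy as np

def svrg_adaptive(grad_i, x0, n, epochs=20, inner_steps=None, eps=1e-8):
    """Hypothetical sketch: SVRG-style variance-reduced updates with an
    AdaGrad-style adaptive step size (NOT the paper's AdaVRAE/AdaVRAG).

    grad_i(x, i) -- gradient of the i-th component function at x
    x0           -- initial point (numpy array)
    n            -- number of component functions in the finite sum
    """
    if inner_steps is None:
        inner_steps = n
    x = x0.copy()
    accum = np.zeros_like(x)  # accumulated squared gradients for adaptivity
    for _ in range(epochs):
        snapshot = x.copy()
        # Full gradient at the snapshot point (n gradient evaluations).
        full_grad = np.mean([grad_i(snapshot, i) for i in range(n)], axis=0)
        for _ in range(inner_steps):
            i = np.random.randint(n)
            # Variance-reduced gradient estimator.
            g = grad_i(x, i) - grad_i(snapshot, i) + full_grad
            accum += g * g
            # Coordinate-wise adaptive step size: no smoothness constant needed.
            step = 1.0 / (np.sqrt(accum) + eps)
            x = x - step * g
    return x

# Example usage on a toy least-squares problem (assumed data for illustration).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A, b = rng.normal(size=(100, 5)), rng.normal(size=100)
    grad_i = lambda x, i: (A[i] @ x - b[i]) * A[i]
    x_hat = svrg_adaptive(grad_i, np.zeros(5), n=100)
```

The sketch shows the general pattern only; the paper's algorithms additionally use acceleration (and, for AdaVRAE, an extra-gradient step) and a different adaptive step-size rule to obtain the stated complexity bounds.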
