Poster
Wed Jul 11 09:15 AM -- 12:00 PM (PDT) @ Hall B #187
Stochastic Variance-Reduced Cubic Regularized Newton Method
Dongruo Zhou · Pan Xu · Quanquan Gu
We propose a stochastic variance-reduced cubic regularized Newton method (SVRC) for non-convex optimization. At the core of our algorithm are a novel semi-stochastic gradient and a semi-stochastic Hessian, which are specifically designed for the cubic regularization method. We show that our algorithm is guaranteed to converge to an $(\epsilon,\sqrt{\epsilon})$-approximate local minimum within $\tilde{O}(n^{4/5}/\epsilon^{3/2})$ second-order oracle calls, which outperforms state-of-the-art cubic regularization algorithms including subsampled cubic regularization. Our work also sheds light on the application of variance reduction techniques to high-order non-convex optimization methods. Thorough experiments on various non-convex optimization problems support our theory.
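To make the core idea concrete, below is a minimal, illustrative sketch (in Python with NumPy/SciPy) of SVRG-style variance reduction applied to cubic-regularized Newton updates. The names grad_i, hess_i, svrc_sketch, the batch sizes, and the cubic penalty M are hypothetical placeholders, and the estimators use only the basic snapshot correction; the exact semi-stochastic gradient and Hessian constructions analyzed in the paper may include further correction terms.

import numpy as np
from scipy.optimize import minimize

def svrc_sketch(grad_i, hess_i, n, x0, epochs=5, inner=5,
                batch_g=32, batch_h=32, M=10.0, seed=0):
    """Illustrative variance-reduced cubic-regularized Newton loop.

    grad_i(x, i), hess_i(x, i): gradient / Hessian of the i-th component
    of the finite sum f(x) = (1/n) * sum_i f_i(x); M: cubic penalty.
    """
    rng = np.random.default_rng(seed)
    d = x0.size
    x = x0.astype(float)
    for _ in range(epochs):
        # Snapshot point: full gradient and Hessian, recomputed once per epoch.
        x_ref = x.copy()
        g_ref = np.mean([grad_i(x_ref, i) for i in range(n)], axis=0)
        H_ref = np.mean([hess_i(x_ref, i) for i in range(n)], axis=0)
        for _ in range(inner):
            Ig = rng.integers(0, n, size=batch_g)
            Ih = rng.integers(0, n, size=batch_h)
            # Semi-stochastic gradient: minibatch difference plus snapshot gradient.
            g = g_ref + np.mean([grad_i(x, i) - grad_i(x_ref, i) for i in Ig], axis=0)
            # Semi-stochastic Hessian, corrected the same way.
            H = H_ref + np.mean([hess_i(x, i) - hess_i(x_ref, i) for i in Ih], axis=0)
            # Cubic-regularized Newton step:
            #   h = argmin_h  g^T h + 0.5 h^T H h + (M/6) ||h||^3
            def model(h):
                return g @ h + 0.5 * h @ H @ h + (M / 6.0) * np.linalg.norm(h) ** 3
            h = minimize(model, np.zeros(d)).x  # generic approximate subproblem solver
            x = x + h
    return x

In practice, grad_i and hess_i would come from per-sample losses, and the cubic subproblem would be solved with a specialized routine (e.g., a Lanczos-type solver) rather than a generic optimizer; the generic call above is only to keep the sketch self-contained.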