Oral
Stochastic Variance-Reduced Hamilton Monte Carlo Methods
Difan Zou · Pan Xu · Quanquan Gu

Thu Jul 12 05:00 AM -- 05:10 AM (PDT) @ A4
We propose a fast stochastic Hamilton Monte Carlo (HMC) method for sampling from a smooth and strongly log-concave distribution. At the core of our proposed method is a variance reduction technique inspired by recent advances in stochastic optimization. We show that, to achieve $\epsilon$ accuracy in 2-Wasserstein distance, our algorithm requires $\tilde O\big(n+\kappa^{2}d^{1/2}/\epsilon+\kappa^{4/3}d^{1/3}n^{2/3}/\epsilon^{2/3}\big)$ gradient complexity (i.e., number of component gradient evaluations), which outperforms the state-of-the-art HMC and stochastic gradient HMC methods in a wide regime. We also extend our algorithm to sampling from smooth and general log-concave distributions, and prove the corresponding gradient complexity as well. Experiments on both synthetic and real data demonstrate the superior performance of our algorithm.
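Below is a minimal, illustrative sketch of the core idea in the abstract: an SVRG-style variance-reduced gradient estimator plugged into a stochastic HMC-like (friction-damped) update. The toy quadratic potential, the names `svr_hmc`, `grad_fi`, `eta`, `gamma`, and the epoch/batch sizes are assumptions chosen for illustration; this is not the paper's exact algorithm, discretization, or constants.

```python
import numpy as np

# Sketch (assumed setup, not the paper's exact scheme): sample from a toy
# strongly log-concave density p(x) proportional to exp(-F(x)), where
# F(x) = (1/n) * sum_i f_i(x) with quadratic components f_i(x) = 0.5*(a_i^T x - b_i)^2.
rng = np.random.default_rng(0)
n, d = 1000, 5
A = rng.normal(size=(n, d))
b = rng.normal(size=n)

def grad_fi(x, idx):
    """Average gradient of the components f_i over the index set idx."""
    Ai, bi = A[idx], b[idx]
    return Ai.T @ (Ai @ x - bi) / len(idx)

def svr_hmc(epochs=20, inner=100, batch=10, eta=1e-3, gamma=1.0):
    """Variance-reduced stochastic-gradient HMC-style sampler (illustrative)."""
    x = np.zeros(d)   # position
    v = np.zeros(d)   # momentum / velocity
    for _ in range(epochs):
        x_snap = x.copy()
        g_snap = grad_fi(x_snap, np.arange(n))   # full gradient at the snapshot
        for _ in range(inner):
            idx = rng.integers(0, n, size=batch)
            # SVRG estimator: unbiased, with variance shrinking near the snapshot.
            g = grad_fi(x, idx) - grad_fi(x_snap, idx) + g_snap
            # Euler step of friction-damped Hamiltonian (SGHMC-style) dynamics
            # with injected Gaussian noise.
            v = v - eta * (gamma * v + g) + np.sqrt(2 * gamma * eta) * rng.normal(size=d)
            x = x + eta * v
    return x

# Run a few independent chains and inspect the sample mean of the final iterates.
samples = np.array([svr_hmc() for _ in range(50)])
print("sample mean:", samples.mean(axis=0))
```

The variance reduction enters only through the gradient estimator: near the snapshot point the correction term cancels most of the minibatch noise, which is what allows larger step sizes than plain stochastic gradient HMC in this kind of scheme.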

Author Information

Difan Zou (University of Virginia)
Pan Xu (University of California, Los Angeles)
Quanquan Gu (University of California, Los Angeles)
