Stochastic Gradient Monomial Gamma Sampler
Yizhe Zhang · Changyou Chen · Zhe Gan · Ricardo Henao · Lawrence Carin

Wed Aug 09 01:30 AM -- 05:00 AM (PDT) @ Gallery #83

Advances in stochastic gradient techniques have made it possible to scale Markov Chain Monte Carlo (MCMC) to estimate posterior distributions from large datasets. Despite this success, existing methods can mix poorly when sampling from multimodal distributions, as evidenced by slow convergence and insufficient exploration of the posterior. We propose a generalized framework that improves the sampling efficiency of stochastic gradient MCMC by leveraging generalized kinetics with superior stationary mixing, especially for multimodal distributions, and introduce several techniques to address the resulting practical issues. We show that the proposed approach better explores complicated multimodal posterior distributions, and demonstrate improvements over other stochastic gradient MCMC methods on a variety of applications.
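To make the setting concrete, here is a minimal sketch of the stochastic gradient MCMC idea the abstract builds on, using plain stochastic gradient Langevin dynamics rather than the paper's monomial Gamma sampler; the bimodal target, step size, and step count are illustrative assumptions, not from the paper:

```python
import numpy as np

def grad_log_p(x):
    """Gradient of log p(x) for an equal mixture of N(-2, 1) and N(2, 1).

    A hypothetical bimodal target, chosen only to illustrate the
    multimodal-mixing issue the abstract describes.
    """
    w1 = np.exp(-0.5 * (x + 2) ** 2)  # unnormalized weight of left mode
    w2 = np.exp(-0.5 * (x - 2) ** 2)  # unnormalized weight of right mode
    return (-(x + 2) * w1 - (x - 2) * w2) / (w1 + w2)

def sgld(n_steps=20000, step=0.05, seed=0):
    """Langevin update: half a gradient step plus matched Gaussian noise."""
    rng = np.random.default_rng(seed)
    x, samples = 0.0, []
    for _ in range(n_steps):
        x += 0.5 * step * grad_log_p(x) + np.sqrt(step) * rng.normal()
        samples.append(x)
    return np.array(samples)

samples = sgld()
# A well-mixing chain should visit both modes (near -2 and +2).
print("mean:", samples.mean(), "frac positive:", (samples > 0).mean())
```

In practice `grad_log_p` would be an unbiased minibatch estimate of the full-data gradient; the paper's contribution concerns replacing the kinetics of such samplers to cross the low-density barrier between modes more efficiently.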

Author Information

Yizhe Zhang (Duke University)
Changyou Chen (Duke University)
Zhe Gan (Duke University)
Ricardo Henao (Duke University)
Lawrence Carin (Duke University)
