It is known that the Langevin dynamics used in MCMC is the gradient flow of the KL divergence on the Wasserstein space, a fact that aids convergence analysis and has inspired recent particle-based variational inference methods (ParVIs). However, no other MCMC dynamics has been understood in this way. In this work, by developing novel concepts, we propose a theoretical framework that recognizes a general MCMC dynamics as the fiber-gradient Hamiltonian flow on the Wasserstein space of a fiber-Riemannian Poisson manifold. The "conservation + convergence" structure of the flow gives a clear picture of the behavior of general MCMC dynamics. We analyze existing MCMC instances under the framework. The framework also enables ParVI simulation of MCMC dynamics, which enriches the ParVI family with more efficient dynamics and brings ParVI advantages to MCMC. We develop two ParVI methods for a particular MCMC dynamics and demonstrate their benefits in experiments.
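For readers unfamiliar with the background fact the abstract starts from, the following is a minimal sketch (not from the paper) of unadjusted Langevin dynamics: particles follow the gradient of the log target density plus Gaussian noise, and their empirical distribution drifts toward the target, which is the discrete-time counterpart of the KL gradient flow on the Wasserstein space. The step size `eps` and the standard-normal target are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def grad_log_p(x):
    # Target: standard normal N(0, 1), so grad log p(x) = -x.
    return -x

# Unadjusted Langevin update: x <- x + eps * grad log p(x) + sqrt(2 eps) * xi
eps = 0.01
particles = rng.normal(loc=5.0, scale=1.0, size=1000)  # initialize far from the target
for _ in range(2000):
    noise = rng.normal(size=particles.shape)
    particles = particles + eps * grad_log_p(particles) + np.sqrt(2 * eps) * noise

# After many steps, the particle cloud approximates the target N(0, 1).
print(particles.mean(), particles.std())
```

Under the Wasserstein gradient-flow view, each step decreases KL(q_t || p) up to discretization error, which is the convergence property the abstract refers to.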
Author Information
Chang Liu (Tsinghua University)
Jingwei Zhuo (Tsinghua University)
Jun Zhu (Tsinghua University)
Related Events (a corresponding poster, oral, or spotlight)
- 2019 Poster: Understanding MCMC Dynamics as Flows on the Wasserstein Space
  Wed. Jun 12th 01:30 -- 04:00 AM, Room Pacific Ballroom #213
More from the Same Authors
- 2022 Poster: Maximum Likelihood Training for Score-based Diffusion ODEs by High Order Denoising Score Matching
  Cheng Lu · Kaiwen Zheng · Fan Bao · Jianfei Chen · Chongxuan Li · Jun Zhu
- 2022 Spotlight: Maximum Likelihood Training for Score-based Diffusion ODEs by High Order Denoising Score Matching
  Cheng Lu · Kaiwen Zheng · Fan Bao · Jianfei Chen · Chongxuan Li · Jun Zhu
- 2022 Poster: Estimating the Optimal Covariance with Imperfect Mean in Diffusion Probabilistic Models
  Fan Bao · Chongxuan Li · Jiacheng Sun · Jun Zhu · Bo Zhang
- 2022 Poster: GSmooth: Certified Robustness against Semantic Transformations via Generalized Randomized Smoothing
  Zhongkai Hao · Chengyang Ying · Yinpeng Dong · Hang Su · Jian Song · Jun Zhu
- 2022 Spotlight: GSmooth: Certified Robustness against Semantic Transformations via Generalized Randomized Smoothing
  Zhongkai Hao · Chengyang Ying · Yinpeng Dong · Hang Su · Jian Song · Jun Zhu
- 2022 Spotlight: Estimating the Optimal Covariance with Imperfect Mean in Diffusion Probabilistic Models
  Fan Bao · Chongxuan Li · Jiacheng Sun · Jun Zhu · Bo Zhang
- 2020 Poster: Learning Optimal Tree Models under Beam Search
  Jingwei Zhuo · Ziru Xu · Wei Dai · Han Zhu · HAN LI · Jian Xu · Kun Gai
- 2019 Poster: Scalable Training of Inference Networks for Gaussian-Process Models
  Jiaxin Shi · Mohammad Emtiyaz Khan · Jun Zhu
- 2019 Poster: Understanding and Accelerating Particle-Based Variational Inference
  Chang Liu · Jingwei Zhuo · Pengyu Cheng · RUIYI (ROY) ZHANG · Jun Zhu
- 2019 Poster: Variational Annealing of GANs: A Langevin Perspective
  Chenyang Tao · Shuyang Dai · Liqun Chen · Ke Bai · Junya Chen · Chang Liu · RUIYI (ROY) ZHANG · Georgiy Bobashev · Lawrence Carin
- 2019 Oral: Understanding and Accelerating Particle-Based Variational Inference
  Chang Liu · Jingwei Zhuo · Pengyu Cheng · RUIYI (ROY) ZHANG · Jun Zhu
- 2019 Oral: Scalable Training of Inference Networks for Gaussian-Process Models
  Jiaxin Shi · Mohammad Emtiyaz Khan · Jun Zhu
- 2019 Oral: Variational Annealing of GANs: A Langevin Perspective
  Chenyang Tao · Shuyang Dai · Liqun Chen · Ke Bai · Junya Chen · Chang Liu · RUIYI (ROY) ZHANG · Georgiy Bobashev · Lawrence Carin
- 2018 Poster: Message Passing Stein Variational Gradient Descent
  Jingwei Zhuo · Chang Liu · Jiaxin Shi · Jun Zhu · Ning Chen · Bo Zhang
- 2018 Poster: Racing Thompson: an Efficient Algorithm for Thompson Sampling with Non-conjugate Priors
  Yichi Zhou · Jun Zhu · Jingwei Zhuo
- 2018 Oral: Message Passing Stein Variational Gradient Descent
  Jingwei Zhuo · Chang Liu · Jiaxin Shi · Jun Zhu · Ning Chen · Bo Zhang
- 2018 Oral: Racing Thompson: an Efficient Algorithm for Thompson Sampling with Non-conjugate Priors
  Yichi Zhou · Jun Zhu · Jingwei Zhuo