Along with the recent advances in scalable Markov chain Monte Carlo methods, sampling techniques based on Langevin diffusions have started receiving increasing attention. These so-called Langevin Monte Carlo (LMC) methods are based on diffusions driven by Brownian motion, which gives rise to Gaussian proposal distributions in the resulting algorithms. Even though these approaches have proven successful in many applications, their performance can be limited by the light-tailed nature of the Gaussian proposals. In this study, we extend classical LMC and develop a novel Fractional LMC (FLMC) framework based on a family of heavy-tailed distributions called α-stable Lévy distributions. As opposed to classical approaches, the proposed approach can make large jumps while still targeting the correct distribution, which is beneficial for efficient exploration of the state space. We develop novel computational methods that scale to large problems, and we provide a formal convergence analysis of the proposed scheme. Our experiments support our theory: FLMC can provide superior performance in multi-modal settings, improved convergence rates, and robustness to algorithm parameters.
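To make the contrast concrete, the sketch below compares a classical unadjusted Langevin update (Gaussian noise) with a simplified heavy-tailed variant that replaces the Gaussian increment by a symmetric α-stable draw. This is only an illustration of the heavy-tailed idea under a standard Gaussian target: the paper's actual FLMC drift involves a fractional (Riesz-type) operator, which is not reproduced here, and the function names (`symmetric_alpha_stable`, `lmc_chain`) are this sketch's own, not the paper's.

```python
import numpy as np

def symmetric_alpha_stable(alpha, size, rng):
    """Draw symmetric alpha-stable samples (beta=0, unit scale) via the
    Chambers-Mallows-Stuck method."""
    V = rng.uniform(-np.pi / 2, np.pi / 2, size)
    W = rng.exponential(1.0, size)
    return (np.sin(alpha * V) / np.cos(V) ** (1.0 / alpha)
            * (np.cos(V - alpha * V) / W) ** ((1.0 - alpha) / alpha))

def grad_U(x):
    # Gradient of the potential U(x) = x^2 / 2, i.e. a standard Gaussian target.
    return x

def lmc_chain(n_steps, eta, rng, alpha=2.0):
    """alpha=2.0 recovers the classical Gaussian-driven Langevin update;
    alpha<2 swaps in heavy-tailed alpha-stable increments (a simplified
    sketch, not the paper's exact FLMC scheme)."""
    x = 0.0
    xs = np.empty(n_steps)
    for k in range(n_steps):
        if alpha == 2.0:
            noise = np.sqrt(2.0 * eta) * rng.standard_normal()
        else:
            # Step size enters as eta^(1/alpha), matching the scaling of
            # alpha-stable Levy increments.
            noise = eta ** (1.0 / alpha) * symmetric_alpha_stable(alpha, 1, rng)[0]
        x = x - eta * grad_U(x) + noise
        xs[k] = x
    return xs

rng = np.random.default_rng(0)
samples = lmc_chain(50_000, 0.01, rng)           # classical LMC
heavy = lmc_chain(5_000, 0.01, rng, alpha=1.7)   # heavy-tailed variant
```

The heavy-tailed chain occasionally makes very large jumps (its increments have infinite variance for α < 2), which is precisely the exploration behavior the abstract highlights; the classical chain's Gaussian proposals keep every step small.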
Author Information
Umut Simsekli (Telecom ParisTech)
Related Events (a corresponding poster, oral, or spotlight)
- 2017 Poster: Fractional Langevin Monte Carlo: Exploring Lévy Driven Stochastic Differential Equations for MCMC
  Wed. Aug 9th, 08:30 AM -- 12:00 PM, Room Gallery #21
More from the Same Authors
- 2020 Poster: Fractional Underdamped Langevin Dynamics: Retargeting SGD with Momentum under Heavy-Tailed Gradient Noise
  Umut Simsekli · Lingjiong Zhu · Yee-Whye Teh · Mert Gurbuzbalaban
- 2019 Poster: Non-Asymptotic Analysis of Fractional Langevin Monte Carlo for Non-Convex Optimization
  Thanh Huy Nguyen · Umut Simsekli · Gaël Richard
- 2019 Poster: A Tail-Index Analysis of Stochastic Gradient Noise in Deep Neural Networks
  Umut Simsekli · Levent Sagun · Mert Gurbuzbalaban
- 2019 Poster: Sliced-Wasserstein Flows: Nonparametric Generative Modeling via Optimal Transport and Diffusions
  Antoine Liutkus · Umut Simsekli · Szymon Majewski · Alain Durmus · Fabian-Robert Stöter
- 2019 Oral: A Tail-Index Analysis of Stochastic Gradient Noise in Deep Neural Networks
  Umut Simsekli · Levent Sagun · Mert Gurbuzbalaban
- 2019 Oral: Non-Asymptotic Analysis of Fractional Langevin Monte Carlo for Non-Convex Optimization
  Thanh Huy Nguyen · Umut Simsekli · Gaël Richard
- 2019 Oral: Sliced-Wasserstein Flows: Nonparametric Generative Modeling via Optimal Transport and Diffusions
  Antoine Liutkus · Umut Simsekli · Szymon Majewski · Alain Durmus · Fabian-Robert Stöter
- 2018 Poster: Asynchronous Stochastic Quasi-Newton MCMC for Non-Convex Optimization
  Umut Simsekli · Cagatay Yildiz · Thanh Huy Nguyen · Ali Taylan Cemgil · Gaël Richard
- 2018 Oral: Asynchronous Stochastic Quasi-Newton MCMC for Non-Convex Optimization
  Umut Simsekli · Cagatay Yildiz · Thanh Huy Nguyen · Ali Taylan Cemgil · Gaël Richard