Forward modeling approaches in cosmology seek to reconstruct the initial conditions of the Universe from observed survey data. However, the high dimensionality of the parameter space makes it challenging to explore the full posterior with traditional algorithms such as Hamiltonian Monte Carlo (HMC) and variational inference (VI). Here we develop a hybrid scheme, called variational self-boosted sampling (VBS), that learns a variational approximation for the proposal distribution of HMC from samples generated on the fly, and in turn generates independent samples as proposals for the MCMC chain to reduce its auto-correlation length. We use a normalizing flow with Fourier-space convolutions as our variational distribution to scale to the high dimensions of interest. We show that after a short initial warm-up and training phase, VBS generates higher-quality samples than simple VI and reduces the correlation length in the sampling phase by a factor of 10-50 compared to using HMC alone.
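The core loop described in the abstract can be illustrated with a minimal toy sketch. This is not the authors' implementation: the normalizing flow is replaced here by a Gaussian fitted to warm-up samples, the target is a 2-D standard normal rather than a cosmological posterior, and all function names (`hmc_step`, `independence_step`) are hypothetical. It only shows the two-phase structure: warm up with HMC, fit a variational proposal to the chain, then alternate HMC transitions with independence Metropolis-Hastings proposals drawn from the fitted distribution.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-D standard-normal target, standing in for the high-dimensional posterior.
def logp(x):
    return -0.5 * float(x @ x)

def grad_logp(x):
    return -x

def hmc_step(x, eps=0.25, n_leap=10):
    """One HMC transition using leapfrog integration."""
    p0 = rng.standard_normal(x.shape)
    xn, p = x.copy(), p0 + 0.5 * eps * grad_logp(x)
    for i in range(n_leap):
        xn = xn + eps * p
        if i < n_leap - 1:
            p = p + eps * grad_logp(xn)
    p = p + 0.5 * eps * grad_logp(xn)
    # Metropolis accept/reject on the Hamiltonian difference.
    log_ratio = logp(xn) - logp(x) - 0.5 * float(p @ p - p0 @ p0)
    return xn if np.log(rng.uniform()) < log_ratio else x

def independence_step(x, mu, cov):
    """Metropolis-Hastings step with an independent Gaussian proposal q,
    standing in for the trained normalizing flow."""
    prec = np.linalg.inv(cov)
    logq = lambda z: -0.5 * float((z - mu) @ prec @ (z - mu))
    prop = rng.multivariate_normal(mu, cov)
    log_ratio = logp(prop) - logp(x) + logq(x) - logq(prop)
    return prop if np.log(rng.uniform()) < log_ratio else x

# Phase 1: warm up with plain HMC; in VBS these samples would train the flow.
x = np.zeros(2)
warmup = []
for _ in range(500):
    x = hmc_step(x)
    warmup.append(x)
warmup = np.array(warmup)
mu, cov = warmup.mean(0), np.cov(warmup.T) + 1e-3 * np.eye(2)

# Phase 2: alternate HMC with independence proposals from the fitted q;
# each accepted independent proposal decorrelates the chain.
samples = []
for t in range(2000):
    x = independence_step(x, mu, cov) if t % 2 else hmc_step(x)
    samples.append(x)
samples = np.array(samples)
print(samples.mean(0))
```

Because an accepted independence proposal is drawn without reference to the current state, mixing these steps into the chain is what shortens the auto-correlation length relative to HMC alone.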
Author Information
Chirag Modi (Flatiron Institute)
Yin Li (Flatiron Institute)
David Blei (Columbia University)
David Blei is a Professor of Statistics and Computer Science at Columbia University, and a member of the Columbia Data Science Institute. His research is in statistical machine learning, involving probabilistic topic models, Bayesian nonparametric methods, and approximate posterior inference algorithms for massive data. He works on a variety of applications, including text, images, music, social networks, user behavior, and scientific data. David has received several awards for his research, including a Sloan Fellowship (2010), Office of Naval Research Young Investigator Award (2011), Presidential Early Career Award for Scientists and Engineers (2011), Blavatnik Faculty Award (2013), and ACM-Infosys Foundation Award (2013). He is a fellow of the ACM.
More from the Same Authors
- 2022: Optimization-based Causal Estimation from Heterogenous Environments
  Mingzhang Yin · Yixin Wang · David Blei
- 2023: Causal-structure Driven Augmentations for Text OOD Generalization
  Amir Feder · Yoav Wald · Claudia Shi · Suchi Saria · David Blei
- 2023: Practical and Asymptotically Exact Conditional Sampling in Diffusion Models
  Brian Trippe · Luhuan Wu · Christian Naesseth · David Blei · John Cunningham
- 2023: SimBIG: Field-level Simulation-based Inference of Large-scale Structure
  Pablo Lemos · Liam Parker · ChangHoon Hahn · Bruno Régaldo-Saint Blancard · Elena Massara · Shirley Ho · David Spergel · Chirag Modi · Azadeh Moradinezhad Dizgah · Michael Eickenberg · Jiamin Hou
- 2023: SimBIG: Galaxy Clustering beyond the Power Spectrum
  ChangHoon Hahn · Pablo Lemos · Bruno Régaldo-Saint Blancard · Liam Parker · Michael Eickenberg · Shirley Ho · Jiamin Hou · Elena Massara · Chirag Modi · Azadeh Moradinezhad Dizgah · David Spergel
- 2023: FLORAH: A generative model for halo assembly histories
  Tri Nguyen · Chirag Modi · Rachel Somerville · L. Y. Aaron Yung
- 2023: Field-Level Inference with Microcanonical Langevin Monte Carlo
  Adrian Bayer · Uros Seljak · Chirag Modi
- 2022 Poster: Variational Inference for Infinitely Deep Neural Networks
  Achille Nazaret · David Blei
- 2022 Spotlight: Variational Inference for Infinitely Deep Neural Networks
  Achille Nazaret · David Blei
- 2021 Poster: Unsupervised Representation Learning via Neural Activation Coding
  Yookoon Park · Sangho Lee · Gunhee Kim · David Blei
- 2021 Poster: A Proxy Variable View of Shared Confounding
  Yixin Wang · David Blei
- 2021 Spotlight: A Proxy Variable View of Shared Confounding
  Yixin Wang · David Blei
- 2021 Oral: Unsupervised Representation Learning via Neural Activation Coding
  Yookoon Park · Sangho Lee · Gunhee Kim · David Blei
- 2018 Poster: Noisin: Unbiased Regularization for Recurrent Neural Networks
  Adji Bousso Dieng · Rajesh Ranganath · Jaan Altosaar · David Blei
- 2018 Oral: Noisin: Unbiased Regularization for Recurrent Neural Networks
  Adji Bousso Dieng · Rajesh Ranganath · Jaan Altosaar · David Blei
- 2018 Poster: Augment and Reduce: Stochastic Inference for Large Categorical Distributions
  Francisco Ruiz · Michalis Titsias · Adji Bousso Dieng · David Blei
- 2018 Poster: Black Box FDR
  Wesley Tansey · Yixin Wang · David Blei · Raul Rabadan
- 2018 Oral: Augment and Reduce: Stochastic Inference for Large Categorical Distributions
  Francisco Ruiz · Michalis Titsias · Adji Bousso Dieng · David Blei
- 2018 Oral: Black Box FDR
  Wesley Tansey · Yixin Wang · David Blei · Raul Rabadan
- 2017 Workshop: Implicit Generative Models
  Rajesh Ranganath · Ian Goodfellow · Dustin Tran · David Blei · Balaji Lakshminarayanan · Shakir Mohamed
- 2017 Poster: Robust Probabilistic Modeling with Bayesian Data Reweighting
  Yixin Wang · Alp Kucukelbir · David Blei
- 2017 Poster: Evaluating Bayesian Models with Posterior Dispersion Indices
  Alp Kucukelbir · Yixin Wang · David Blei
- 2017 Poster: Zero-Inflated Exponential Family Embeddings
  Liping Liu · David Blei
- 2017 Talk: Zero-Inflated Exponential Family Embeddings
  Liping Liu · David Blei
- 2017 Talk: Evaluating Bayesian Models with Posterior Dispersion Indices
  Alp Kucukelbir · Yixin Wang · David Blei
- 2017 Talk: Robust Probabilistic Modeling with Bayesian Data Reweighting
  Yixin Wang · Alp Kucukelbir · David Blei