Oral in Workshop: Structured Probabilistic Inference and Generative Modeling

BayesDAG: Gradient-Based Posterior Sampling for Causal Discovery

Yashas Annadani · Nick Pawlowski · Joel Jennings · Stefan Bauer · Cheng Zhang · Wenbo Gong

Keywords: [ structure learning ] [ causal discovery ] [ Bayesian Inference ] [ MCMC ] [ Variational Inference ]


Abstract:

Bayesian causal discovery aims to infer the posterior distribution over causal models from observed data, quantifying epistemic uncertainty and benefiting downstream tasks. However, computational challenges arise from joint inference over the combinatorial space of Directed Acyclic Graphs (DAGs) and nonlinear functions. In this work, we introduce a scalable Bayesian causal discovery framework based on stochastic gradient Markov Chain Monte Carlo (SG-MCMC) that directly samples DAGs from the posterior without any DAG regularization, simultaneously draws function parameter samples, and is applicable to both linear and nonlinear causal models. To enable our approach, we derive a novel equivalence to permutation-based DAG learning, which opens up the possibility of using any relaxed gradient estimator defined over permutations. To our knowledge, this is the first framework applying gradient-based MCMC sampling to causal discovery. Empirical evaluations on synthetic and real-world datasets demonstrate our approach's effectiveness compared to state-of-the-art baselines.
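To make the SG-MCMC idea concrete, the sketch below (not the authors' implementation) runs stochastic gradient Langevin dynamics to draw posterior samples of edge weights for a linear Gaussian SEM, assuming a fixed, known topological order. BayesDAG's key contribution, the relaxed gradient estimator over permutations that samples the node ordering jointly, is omitted here for brevity; all variable names and hyperparameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Synthetic data from a linear Gaussian SEM with known topological order 0, 1, ..., d-1 ---
d, n = 5, 2000
A_true = np.triu(rng.normal(0, 1.0, (d, d)) * (rng.random((d, d)) < 0.4), k=1)
X = np.zeros((n, d))
for j in range(d):
    X[:, j] = X @ A_true[:, j] + rng.normal(0, 1.0, n)

# Edges permitted by the (fixed) order: only i -> j with i < j.
mask = np.triu(np.ones((d, d)), k=1)

def grad_log_post(A, X_batch, n_total):
    """Gradient of the log posterior for a unit-variance linear SEM with N(0, 1) prior on weights."""
    R = X_batch - X_batch @ A                               # residuals under current weights
    grad_lik = (X_batch.T @ R) * (n_total / len(X_batch))   # minibatch likelihood gradient, rescaled
    grad_prior = -A                                          # standard Gaussian prior gradient
    return (grad_lik + grad_prior) * mask                    # restrict to edges allowed by the order

# --- SGLD: A <- A + (eps/2) * grad log p(A | data) + N(0, eps) noise ---
A = np.zeros((d, d))
eps, batch = 1e-4, 256
samples = []
for t in range(5000):
    idx = rng.choice(n, batch, replace=False)
    A = A + 0.5 * eps * grad_log_post(A, X[idx], n) \
          + np.sqrt(eps) * rng.normal(0.0, 1.0, (d, d)) * mask
    if t > 2000 and t % 10 == 0:                             # discard burn-in, thin the chain
        samples.append(A.copy())

print("Posterior-mean edge weights:\n", np.round(np.mean(samples, axis=0), 2))
print("True edge weights:\n", np.round(A_true, 2))
```

In the full method, the gradient step would also update continuous parameters that induce the permutation (and hence the DAG constraint) via a relaxed gradient estimator, so that graph structure and function parameters are sampled jointly rather than with the order held fixed as above.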
