

Poster in Workshop: Duality Principles for Modern Machine Learning

Estimating joint interventional distributions from marginal interventional data

Sergio Garrido Mejia · Elke Kirschbaum · Armin Kekić · Atalanti Mastakouri

Keywords: [ Causality ] [ Joint interventional estimation ] [ Causal feature selection ] [ Lagrange duality ] [ MAXENT ]


Abstract:

We consider settings where only marginal datasets of a target variable and some treatment variable are available, but no joint observations of the target and all treatments together (the so-called Causal Marginal Problem setting). In this paper we show how to exploit interventional data to recover the joint conditional distribution of all the variables using the Maximum Entropy principle. To this end, we extend the Causal Maximum Entropy method to make use of interventional data in addition to observational data. Using Lagrange duality, we prove that the solution to the Causal Maximum Entropy problem with interventional constraints lies in the exponential family, as does the standard Maximum Entropy solution. Our method allows us to perform two tasks of interest when marginal interventional distributions are provided for all, or only some, of the variables: first, causal feature selection from a mixture of observational and single-variable interventional data, and second, inference of joint interventional distributions. For the former task, we show on synthetically generated data that our proposed method outperforms the state-of-the-art method for merging datasets and yields results comparable to the KCI test, which requires access to joint observations of all variables.
