

Poster in Workshop: “Could it have been different?” Counterfactuals in Minds and Machines

Natural Counterfactuals With Necessary Backtracking

Guangyuan Hao · Jiji Zhang · Hao Wang · Kun Zhang


Abstract:

Counterfactual reasoning, a cognitive ability possessed by humans, is being actively studied for incorporation into machine learning systems. Within the causal modelling approach to counterfactuals, Judea Pearl's theory remains the most influential. However, because it is thoroughly non-backtracking, the counterfactual probability distributions it defines can be hard for non-parametric models to learn, even when the causal structure is fully given. A key challenge is that non-backtracking counterfactuals can easily step outside the support of the training data, where inference by current machine learning models becomes highly unreliable. To mitigate this issue, we propose an alternative theory of counterfactuals, namely, natural counterfactuals. This theory is concerned with counterfactuals within the support of the data distribution, and defines in a principled way a different kind of counterfactual that backtracks if (but only if) necessary. To demonstrate potential applications of the theory and illustrate the advantages of natural counterfactuals, we conduct a case study of counterfactual generation and discuss empirical observations that lend support to our approach.
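To make the contrast concrete, the following is a minimal, illustrative Python sketch of the two kinds of counterfactual on a toy structural causal model. The SCM (X1 → X2 → Y), the support check, and all variable names are assumptions made for this example only, not the model or algorithm from the paper: the Pearl-style counterfactual holds the upstream X1 fixed even when the resulting pair (X1, X2*) lies far outside the training data, whereas the natural counterfactual backtracks on X1 only when that happens.

```python
import numpy as np

# Toy SCM, assumed purely for illustration (not the paper's model):
#   X1 = U1,            U1 ~ N(0, 1)
#   X2 = X1 + U2,       U2 ~ N(0, 0.1)   -> X2 tracks X1 closely in the data
#   Y  = X1 + X2 + U3,  U3 ~ N(0, 0.1)
rng = np.random.default_rng(0)
n = 10_000
u1, u2 = rng.normal(0, 1, n), rng.normal(0, 0.1, n)
x1, x2 = u1, u1 + u2
resid_scale = np.std(x2 - x1)  # empirical spread of X2 - X1 in the "training" data


def off_support(x1_val, x2_val):
    """Crude support check: flag (x1, x2) when its residual X2 - X1 lies far
    outside what the training data exhibits. Any density or support
    estimator could stand in here."""
    return abs(x2_val - x1_val) > 4 * resid_scale


def pearl_counterfactual(x1_obs, x2_obs, y_obs, x2_star):
    """Non-backtracking (Pearl-style) counterfactual for do(X2 = x2_star):
    abduct the noise of Y, hold X1 at its observed value, predict Y."""
    u3_hat = y_obs - x1_obs - x2_obs          # abduction
    return x1_obs + x2_star + u3_hat          # action + prediction


def natural_counterfactual(x1_obs, x2_obs, y_obs, x2_star):
    """Counterfactual for X2 = x2_star that backtracks only if necessary:
    if (x1_obs, x2_star) falls outside the data support, minimally adjust
    the upstream X1 so that x2_star is realized through X2 = X1 + U2 with
    the abducted noise; otherwise behave exactly like the Pearl version."""
    u2_hat = x2_obs - x1_obs                  # abduction of X2's noise
    if off_support(x1_obs, x2_star):
        x1_cf = x2_star - u2_hat              # necessary backtracking on X1
    else:
        x1_cf = x1_obs                        # no backtracking needed
    u3_hat = y_obs - x1_obs - x2_obs
    return x1_cf + x2_star + u3_hat, x1_cf


# Example: an observed unit and a counterfactual antecedent far from its X1.
x1_o, x2_o = 0.1, 0.15
y_o = x1_o + x2_o + 0.05
print(pearl_counterfactual(x1_o, x2_o, y_o, x2_star=3.0))    # evaluates the off-support pair (0.1, 3.0)
print(natural_counterfactual(x1_o, x2_o, y_o, x2_star=3.0))  # backtracks X1 to about 2.95
```

In the setting the abstract describes, the mechanisms are learned by non-parametric models rather than known in closed form, so evaluating them on off-support inputs such as (0.1, 3.0) is exactly where inference becomes unreliable; keeping the counterfactual within the data support, by backtracking only when needed, is the point of the proposed theory.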
