Learning Counterfactually Invariant Predictors
Francesco Quinzan · Cecilia Casolo · Krikamol Muandet · Yucen Luo · Niki Kilbertus

Fri Jul 28 06:20 PM -- 06:50 PM (PDT)

Notions of counterfactual invariance have proven essential for predictors that are fair, robust, and generalizable in the real world. We propose simple graphical criteria that yield a sufficient condition for a predictor to be counterfactually invariant in terms of (conditional independence in) the observational distribution. Any predictor that satisfies our criterion is provably counterfactually invariant. To learn such predictors, we propose a model-agnostic framework, called Counterfactually Invariant Prediction (CIP), building on the Hilbert-Schmidt Conditional Independence Criterion (HSCIC), a kernel-based measure of conditional dependence. Our experimental results demonstrate the effectiveness of CIP in enforcing counterfactual invariance on various simulated and real-world datasets, including scalar and multivariate settings.
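The core ingredient described in the abstract is a kernel-based conditional dependence penalty: the predictor is trained with its usual task loss plus an HSCIC term that encourages the prediction to be conditionally independent of the protected or perturbed variables given the conditioning set identified by the graphical criterion. The sketch below is illustrative only and is not the authors' released implementation; the Gaussian kernels, ridge parameter, conditioning-set choice, and penalty weight `gamma` are assumptions for the example.

```python
# Illustrative sketch of an HSCIC-penalized training objective in PyTorch.
# Not the authors' code; kernel choices and hyperparameters are assumed.
import torch


def gaussian_gram(x, sigma=1.0):
    """Gram matrix of a Gaussian (RBF) kernel for inputs of shape (n, d)."""
    sq_dists = torch.cdist(x, x) ** 2
    return torch.exp(-sq_dists / (2.0 * sigma ** 2))


def hscic(yhat, a, x, ridge=0.1, sigma=1.0):
    """Plug-in estimate of HSCIC(Yhat, A | X).

    Conditional mean embeddings are approximated via kernel ridge regression:
    W = (K_X + n * ridge * I)^{-1} K_X, whose i-th column gives the weights
    of the embedding conditioned on x_i.
    """
    n = x.shape[0]
    k_x = gaussian_gram(x, sigma)
    k_y = gaussian_gram(yhat, sigma)
    k_a = gaussian_gram(a, sigma)

    eye = torch.eye(n, dtype=x.dtype, device=x.device)
    w = torch.linalg.solve(k_x + n * ridge * eye, k_x)  # (n, n)

    # Per-point squared MMD between the joint embedding of (Yhat, A) | x_i
    # and the product of the marginal embeddings Yhat | x_i and A | x_i.
    ky_w = k_y @ w
    ka_w = k_a @ w
    term1 = torch.einsum("ji,jk,ki->i", w, k_y * k_a, w)
    term2 = (w * ky_w * ka_w).sum(dim=0)
    term3 = (w * ky_w).sum(dim=0) * (w * ka_w).sum(dim=0)
    return (term1 - 2.0 * term2 + term3).mean()


def cip_loss(model, x, a, y, gamma=1.0):
    """Task loss plus an HSCIC penalty encouraging Yhat independent of A given X."""
    # Here the predictor sees both X and A; the conditioning set X would be
    # chosen according to the paper's graphical criterion (an assumption here).
    yhat = model(torch.cat([x, a], dim=1))
    mse = torch.nn.functional.mse_loss(yhat, y)
    return mse + gamma * hscic(yhat, a, x)
```

In such a setup, the penalty weight `gamma` trades off predictive accuracy against the degree of (counterfactual) invariance enforced by the HSCIC term.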

Author Information

Francesco Quinzan (University of Oxford)
Cecilia Casolo (Helmholtz München)
Krikamol Muandet (CISPA Helmholtz Center for Information Security)
Yucen Luo (Max Planck Institute for Intelligent Systems)

I am a postdoctoral researcher in the Empirical Inference Department at the Max Planck Institute for Intelligent Systems. My research interests are in data-efficient (causal) representation learning.

Niki Kilbertus (TUM & Helmholtz AI)
