

Poster in Workshop: 2nd Workshop on Formal Verification of Machine Learning

Learning Counterfactually Invariant Predictors

Francesco Quinzan · Cecilia Casolo · Krikamol Muandet · Yucen Luo · Niki Kilbertus


Abstract:

Notions of counterfactual invariance have proven essential for predictors that are fair, robust, and generalizable in the real world. We propose simple graphical criteria that yield a sufficient condition for a predictor to be counterfactually invariant in terms of (conditional independence in) the observational distribution. Any predictor that satisfies our criterion is provably counterfactually invariant. In order to learn such predictors, we propose a model-agnostic framework, called Counterfactual Invariance Prediction (CIP), building on a kernel-based conditional dependence measure called the Hilbert-Schmidt Conditional Independence Criterion (HSCIC). Our experimental results demonstrate the effectiveness of CIP in enforcing counterfactual invariance across various simulated and real-world datasets, including scalar and multivariate settings.
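As a rough illustration of the idea described above (this is a sketch, not the authors' released implementation), a conditional dependence penalty of the HSCIC type can be combined with an ordinary predictive loss as follows: the HSCIC term is estimated with Gaussian kernels and kernel ridge regression of the conditional mean embedding weights, and added to the mean squared error with a trade-off weight gamma. The kernel bandwidth, ridge parameter, gamma, and all function names below are illustrative assumptions.

import torch

def gaussian_kernel(x, sigma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix for a batch of samples.
    x = x.reshape(x.shape[0], -1)
    sq_dists = torch.cdist(x, x) ** 2
    return torch.exp(-sq_dists / (2 * sigma ** 2))

def quad_diag(M, W):
    # Vector of quadratic forms w_i^T M w_i, where w_i is the i-th column of W.
    return ((M @ W) * W).sum(dim=0)

def hscic(y_hat, a, s, ridge=1e-2):
    # Empirical HSCIC(y_hat, a | s): kernel-ridge estimate of the conditional
    # mean embedding weights, then the plug-in statistic averaged over samples.
    n = y_hat.shape[0]
    K_y = gaussian_kernel(y_hat)
    K_a = gaussian_kernel(a)
    K_s = gaussian_kernel(s)
    # Column i of W holds the ridge-regression weights for conditioning point s_i.
    W = torch.linalg.solve(K_s + n * ridge * torch.eye(n), K_s)
    term1 = quad_diag(K_y * K_a, W)
    term2 = (W * ((K_y @ W) * (K_a @ W))).sum(dim=0)
    term3 = quad_diag(K_y, W) * quad_diag(K_a, W)
    return (term1 - 2.0 * term2 + term3).mean()

def cip_loss(model, x, y, a, s, gamma=10.0):
    # Predictive loss plus the conditional-dependence penalty, weighted by gamma.
    y_hat = model(x)
    return torch.nn.functional.mse_loss(y_hat, y) + gamma * hscic(y_hat, a, s)

Here gamma controls the trade-off between predictive accuracy and the strength with which conditional independence of the predictions from a given s is enforced; larger values push harder toward invariance at some cost in fit.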
