
Learning Counterfactually Invariant Predictors

Abstract

Notions of counterfactual invariance (CI) have proven essential for predictors that are fair, robust, and generalizable in the real world. We propose graphical criteria that yield a sufficient condition for a predictor to be counterfactually invariant, expressed as a conditional independence in the observational distribution. To learn such predictors, we propose a model-agnostic framework, called Counterfactually Invariant Prediction (CIP), building on the Hilbert-Schmidt Conditional Independence Criterion (HSCIC), a kernel-based conditional dependence measure. Our experimental results demonstrate the effectiveness of CIP in enforcing counterfactual invariance across various simulated and real-world datasets, including scalar and multivariate settings.
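
For intuition, the HSCIC penalty at the heart of CIP can be sketched in a few lines of PyTorch. The snippet below is a minimal illustration under stated assumptions, not the paper's implementation: it estimates HSCIC(Ŷ, Z | X) with the usual kernel-ridge estimator of conditional mean embeddings, and every name and hyperparameter (hscic, lam, sigma, gamma) is an assumption made for the example.

```python
import torch

def rbf_kernel(a, b, sigma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of a and b.
    d2 = torch.cdist(a, b) ** 2
    return torch.exp(-d2 / (2.0 * sigma ** 2))

def hscic(yhat, z, x, lam=1e-3, sigma=1.0):
    # Empirical HSCIC(yhat, z | x), averaged over the sample.
    # Inputs are 2-D float tensors of shape (n, d).
    n = x.shape[0]
    Kx = rbf_kernel(x, x, sigma)
    Ky = rbf_kernel(yhat, yhat, sigma)
    Kz = rbf_kernel(z, z, sigma)
    # Column i of W holds the kernel-ridge weights approximating the
    # conditional mean embedding at the i-th conditioning point.
    W = torch.linalg.solve(Kx + n * lam * torch.eye(n), Kx)
    WKy = Ky @ W
    WKz = Kz @ W
    # ||mu_{yz|x} - mu_{y|x} (x) mu_{z|x}||^2 per conditioning point.
    term1 = torch.einsum('an,ab,bn->n', W, Ky * Kz, W)
    term2 = torch.einsum('an,an->n', W, WKy * WKz)
    term3 = torch.einsum('an,an->n', W, WKy) * torch.einsum('an,an->n', W, WKz)
    return (term1 - 2.0 * term2 + term3).mean()

# Illustrative training objective: predictive loss plus the HSCIC
# penalty, where gamma trades accuracy against invariance.
# loss = torch.nn.functional.mse_loss(yhat, y) + gamma * hscic(yhat, z, x)
```

In this sketch, driving HSCIC(Ŷ, Z | X) toward zero pushes the predictor toward the conditional independence that, by the paper's graphical criteria, suffices for counterfactual invariance.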

Article in Transactions on Machine Learning Research (TMLR), Jul. 2024.

Authors

F. Quinzan • C. Casolo • K. Muandet • Y. Luo • N. Kilbertus

Research Area

A3 | Computational Models

BibTeX Key: QCM+24
