
Scalable Out-of-distribution Robustness in the Presence of Unobserved Confounders (2411.19923v1)

Published 29 Nov 2024 in cs.LG and stat.ML

Abstract: We consider the task of out-of-distribution (OOD) generalization, where the distribution shift is due to an unobserved confounder ($Z$) affecting both the covariates ($X$) and the labels ($Y$). In this setting, traditional assumptions of covariate and label shift are unsuitable due to the confounding, which introduces heterogeneity in the predictor, i.e., $\hat{Y} = f_Z(X)$. OOD generalization differs from traditional domain adaptation by not assuming access to the covariate distribution ($X^{\text{te}}$) of the test samples during training. These conditions create a challenging scenario for OOD robustness: (a) $Z^{\text{tr}}$ is an unobserved confounder during training, (b) $P^{\text{te}}(Z) \neq P^{\text{tr}}(Z)$, (c) $X^{\text{te}}$ is unavailable during training, and (d) the posterior predictive distribution depends on $P^{\text{te}}(Z)$, i.e., $\hat{Y} = E_{P^{\text{te}}(Z)}[f_Z(X)]$. In general, accurate predictions are unattainable in this scenario, and existing literature has proposed complex predictors based on identifiability assumptions that require multiple additional variables. Our work investigates a set of identifiability assumptions that tremendously simplify the predictor, whose resulting elegant simplicity outperforms existing approaches.
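To make the posterior predictive expression $\hat{Y} = E_{P^{\text{te}}(Z)}[f_Z(X)]$ concrete, the following is a minimal numerical sketch, not the paper's method: it assumes a discrete confounder $Z$ with $K$ values and a known test-time prior, and all names (`posterior_predictive`, `f_z`, `p_te_z`) are illustrative placeholders rather than anything from the paper.

```python
import numpy as np

# Sketch of \hat{Y} = E_{P^{te}(Z)}[f_Z(X)] for a discrete confounder Z with K values.
# Assumption: the Z-specific predictors f_Z and the test prior P^{te}(Z) are given;
# how they are obtained is exactly what the paper's identifiability assumptions address.

def posterior_predictive(x, f_z, p_te_z):
    """Average the Z-specific predictions f_z[k](x) under the test-time prior p_te_z.

    x      : covariate vector (np.ndarray)
    f_z    : list of K callables, f_z[k](x) -> prediction under Z = k
    p_te_z : np.ndarray of shape (K,), the test-time distribution P^{te}(Z)
    """
    preds = np.array([f(x) for f in f_z])  # f_Z(X) evaluated for each value of Z
    return np.dot(p_te_z, preds)           # expectation over P^{te}(Z)

# Toy usage: two confounder values inducing different (heterogeneous) predictors.
f_z = [lambda x: 2.0 * x.sum(), lambda x: -1.0 * x.sum()]
p_te_z = np.array([0.3, 0.7])              # shifted test prior, P^{te}(Z) != P^{tr}(Z)
x = np.array([1.0, 0.5])
print(posterior_predictive(x, f_z, p_te_z))  # 0.3*3.0 + 0.7*(-1.5) = -0.15
```

The sketch only illustrates condition (d): because the prediction is an expectation over $P^{\text{te}}(Z)$, a shift in the confounder distribution changes $\hat{Y}$ even when the per-group predictors $f_Z$ are fixed.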
