
Statistical Learning under Heterogeneous Distribution Shift
Max Simchowitz · Anurag Ajay · Pulkit Agrawal · Akshay Krishnamurthy

Thu Jul 27 01:30 PM -- 03:00 PM (PDT) @ Exhibit Hall 1 #334
This paper studies the prediction of a target $\mathbf{z}$ from a pair of random variables $(\mathbf{x},\mathbf{y})$, where the ground-truth predictor is additive: $\mathbb{E}[\mathbf{z} \mid \mathbf{x},\mathbf{y}] = f_\star(\mathbf{x}) + g_\star(\mathbf{y})$. We study the performance of empirical risk minimization (ERM) over functions $f+g$, $f \in \mathcal{F}$ and $g \in \mathcal{G}$, fit on a given training distribution but evaluated on a test distribution that exhibits covariate shift. We show that, when the class $\mathcal{F}$ is "simpler" than $\mathcal{G}$ (measured, e.g., in terms of its metric entropy), our predictor is more resilient to *heterogeneous covariate shifts* in which the shift in $\mathbf{x}$ is much greater than that in $\mathbf{y}$. These results rely on a novel Hölder-style inequality for the Dudley integral, which may be of independent interest. Moreover, we corroborate our theoretical findings with experiments demonstrating improved resilience to shifts in "simpler" features across numerous domains.
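The setup can be illustrated with a minimal simulation. The sketch below (my own construction, not code from the paper) fits an additive model by least squares, with a linear class for $f$ (the "simple" class $\mathcal{F}$) and a polynomial class for $g$ (the richer class $\mathcal{G}$), then compares test risk under a large shift in $\mathbf{x}$ versus an equally large shift in $\mathbf{y}$. All function names and distributional choices here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ground truth E[z | x, y] = f_star(x) + g_star(y):
# f_star is "simple" (linear), g_star is more complex.
f_star = lambda x: 2.0 * x
g_star = lambda y: np.sin(3 * y) + y**2

n = 2000
x_tr = rng.normal(0.0, 1.0, n)
y_tr = rng.normal(0.0, 1.0, n)
z_tr = f_star(x_tr) + g_star(y_tr) + 0.1 * rng.normal(size=n)

# ERM over f + g: f linear in x (simple class F), g a degree-5
# polynomial in y (richer class G); joint least squares fit.
def features(x, y):
    return np.column_stack([x] + [y**k for k in range(6)])

theta, *_ = np.linalg.lstsq(features(x_tr, y_tr), z_tr, rcond=None)

def risk(x, y):
    # Squared error against the noiseless ground truth.
    preds = features(x, y) @ theta
    return np.mean((preds - f_star(x) - g_star(y)) ** 2)

# Heterogeneous covariate shift: shift only the "simple" feature x,
# versus shifting only the "complex" feature y by the same amount.
x_in, y_in = rng.normal(0.0, 1.0, n), rng.normal(0.0, 1.0, n)
risk_shift_x = risk(rng.normal(2.0, 1.0, n), y_in)  # shift in x
risk_shift_y = risk(x_in, rng.normal(2.0, 1.0, n))  # shift in y

print(f"risk under x-shift: {risk_shift_x:.3f}")
print(f"risk under y-shift: {risk_shift_y:.3f}")
```

Consistent with the paper's message, the shift in the simple feature $\mathbf{x}$ is benign here (the linear fit extrapolates), while the same-magnitude shift in the complex feature $\mathbf{y}$ is costly (the polynomial extrapolates poorly outside the training range).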

Author Information

Max Simchowitz (Massachusetts Institute of Technology)
Anurag Ajay (Massachusetts Institute of Technology)
Pulkit Agrawal (MIT)
Akshay Krishnamurthy (Microsoft)