Monotonic Risk Relationships under Distribution Shifts for Regularized Risk Minimization
Daniel LeJeune · Jiayu Liu · Reinhard Heckel

Machine learning systems are often applied to data drawn from a different distribution than the training distribution. Recent work has shown that for a variety of classification and signal reconstruction problems, out-of-distribution performance is strongly linearly correlated with in-distribution performance. If this relationship, or more generally a monotonic one, holds, it has important consequences. For example, it allows optimizing performance on one distribution as a proxy for performance on the other. In this work, we study conditions under which a monotonic relationship between the performances of a model on two distributions is expected. We prove an exact asymptotic linear relation for squared error and a monotonic relation for misclassification error under a subspace shift model with feature scaling.
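As a minimal illustrative sketch (not the paper's exact model or proof setup), the following simulates ridge regression under a simple subspace feature-scaling shift and checks that in-distribution and out-of-distribution squared errors move together as the regularization strength varies. The dimensions, noise level, subspace rank, and scaling factor below are arbitrary choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n, n_test, k = 50, 200, 2000, 10

# Ground-truth linear model (hypothetical parameters for illustration)
w_star = rng.normal(size=d) / np.sqrt(d)

# Out-of-distribution shift: scale features in a random k-dimensional subspace
U = np.linalg.qr(rng.normal(size=(d, d)))[0][:, :k]
scale = 3.0
sqrt_cov_ood = np.eye(d) + (scale - 1.0) * U @ U.T  # subspace directions scaled by `scale`

def sample(m, sqrt_cov):
    """Draw m points with the given covariance square root; same labeling model."""
    X = rng.normal(size=(m, d)) @ sqrt_cov
    y = X @ w_star + 0.1 * rng.normal(size=m)
    return X, y

X, y = sample(n, np.eye(d))                   # in-distribution training data
X_id, y_id = sample(n_test, np.eye(d))        # in-distribution test data
X_ood, y_ood = sample(n_test, sqrt_cov_ood)   # shifted test data

def ridge_risks(lmbda):
    """Ridge estimator at regularization lmbda; returns (ID, OOD) test squared errors."""
    w = np.linalg.solve(X.T @ X + lmbda * np.eye(d), X.T @ y)
    return (np.mean((X_id @ w - y_id) ** 2),
            np.mean((X_ood @ w - y_ood) ** 2))

# Sweep the regularization path and compare the two risk curves
risks = np.array([ridge_risks(l) for l in np.logspace(-2, 3, 30)])
corr = np.corrcoef(risks[:, 0], risks[:, 1])[0, 1]
print(f"ID/OOD risk correlation across the ridge path: {corr:.3f}")
```

In this toy setting the OOD risk differs from the ID risk only through an extra penalty on the error component lying in the scaled subspace, so the two risks trace an approximately affine, and hence monotone, relationship along the regularization path.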

Author Information

Daniel LeJeune (Rice University)

I'm a Ph.D. student working under Richard Baraniuk in the DSP Group at Rice University. I'm interested in developing algorithms for solving machine learning and optimization problems.

Jiayu Liu (Technical University of Munich)
Reinhard Heckel (TU Munich)
