Out of the Ordinary: Spectrally Adapting Regression for Covariate Shift
Benjamin Eyre · Elliot Creager · David Madras · Vardan Papyan · Richard Zemel
Event URL: https://openreview.net/forum?id=aJqJrekiNi

Designing deep neural network classifiers that perform robustly on distributions differing from the available training data is an active area of machine learning research. However, out-of-distribution generalization for regression, the analogous problem for modeling continuous targets, remains relatively unexplored. To tackle this problem, we return to first principles and analyze how the closed-form solution for ordinary least squares (OLS) regression is sensitive to covariate shift. We characterize the out-of-distribution risk of the OLS model in terms of the eigenspectrum decomposition of the source and target data. We then use this insight to propose a method for adapting the weights of the last layer of a pre-trained neural regression model to perform better on input data originating from a different distribution. We demonstrate how this lightweight spectral adaptation procedure can improve out-of-distribution performance in a suite of both synthetic and real-world experiments.
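The abstract names two concrete ingredients: the closed-form OLS solution and an eigendecomposition of the source and target inputs, used to adjust a pre-trained model's last-layer weights. The authors' actual procedure is in the linked OpenReview entry; the numpy sketch below is only an illustrative guess at what such a spectral adaptation could look like. The function `spectral_adapt`, its rescaling rule, and the toy data are assumptions for illustration, not the paper's method.

```python
import numpy as np

def ols_weights(X, y):
    # Closed-form ordinary least squares: w = (X^T X)^+ X^T y
    return np.linalg.pinv(X.T @ X) @ X.T @ y

def spectral_adapt(w, X_src, X_tgt, eps=1e-8):
    # Hypothetical adaptation: express w in the eigenbasis of the source
    # covariance, then rescale each component by the ratio of source to
    # target variance along that eigendirection, dampening directions that
    # are amplified under the target distribution. For a neural regressor,
    # X_src and X_tgt would be penultimate-layer features and w the last layer.
    cov_src = X_src.T @ X_src / len(X_src)
    evals_src, V = np.linalg.eigh(cov_src)       # columns of V are eigenvectors
    evals_tgt = ((X_tgt @ V) ** 2).mean(axis=0)  # target variance per direction
    w_spec = V.T @ w                             # w in the source eigenbasis
    return V @ (w_spec * evals_src / (evals_tgt + eps))

# Toy usage: axis-aligned covariate shift between source and target inputs.
rng = np.random.default_rng(0)
X_src = rng.normal(size=(500, 10))
y_src = X_src @ rng.normal(size=10) + 0.1 * rng.normal(size=500)
X_tgt = rng.normal(size=(500, 10)) * np.linspace(0.5, 3.0, 10)

w = ols_weights(X_src, y_src)
w_adapted = spectral_adapt(w, X_src, X_tgt)
```

The point of the sketch is the shape of the computation rather than the specific rescaling: the eigenspectra of the source and target data expose which input directions shift, and only an eigenproblem over the feature dimension plus a matrix-vector product are needed to adjust the weights, which is what makes such a last-layer procedure lightweight.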

Author Information

Benjamin Eyre (University of Toronto, Vector Institute)

I am a master's student at the University of Toronto, where I am fortunate to be supervised by Professors Richard Zemel and Vardan Papyan. I am interested in researching techniques for creating learnt representations that are robust, explainable, and fair. I am also interested in the training dynamics at play when producing these representations.

Elliot Creager (University of Toronto)
David Madras (University of Toronto)
Vardan Papyan (University of Toronto)
Richard Zemel (Columbia University)
