Poster
Supervised Quantile Normalization for Low Rank Matrix Factorization
Marco Cuturi · Olivier Teboul · Jonathan Niles-Weed · Jean-Philippe Vert

Thu Jul 16 12:00 PM -- 12:45 PM & Fri Jul 17 01:00 AM -- 01:45 AM (PDT)
Low-rank matrix factorization is a fundamental building block in machine learning, used for instance to summarize gene expression profiles or word-document counts. To be robust to outliers and to differences in scale across features, a matrix factorization step is usually preceded by ad hoc feature normalization steps, such as tf-idf scaling or data whitening. In this work, we propose to learn these normalization operators jointly with the factorization itself. More precisely, given a $d\times n$ matrix $X$ of $d$ features measured on $n$ individuals, we propose to learn the parameters of quantile normalization operators that can operate row-wise on the values of $X$ and/or of its factorization $UV$ to improve the quality of the low-rank representation of $X$ itself. This optimization is facilitated by the introduction of differentiable quantile normalization operators derived using regularized optimal transport algorithms.
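To make the row-wise operation concrete, the sketch below implements classical (hard) quantile normalization of each row of $X$ onto a fixed target distribution. The function name, the choice of NumPy, and the equal-length `target` vector are illustrative assumptions; the paper's contribution is a *differentiable* relaxation of this rank-based map via regularized optimal transport, which this hard version does not show.

```python
import numpy as np

def quantile_normalize_rows(X, target):
    """Hard quantile normalization: map each row of X onto the sorted
    values of `target` by rank. Hypothetical helper illustrating the
    row-wise operator; assumes len(target) == X.shape[1]. The paper
    replaces the non-differentiable ranking step with a smooth one
    obtained from regularized optimal transport (not shown here)."""
    t = np.sort(np.asarray(target, dtype=float))
    out = np.empty(X.shape, dtype=float)
    for i, row in enumerate(X):
        # rank of each entry within its row (0 = smallest)
        ranks = np.argsort(np.argsort(row))
        # replace each entry by the target quantile of the same rank
        out[i] = t[ranks]
    return out
```

After this step every row shares the same empirical distribution (that of `target`), which is what makes the subsequent factorization robust to per-feature scale differences; the learned version parameterizes the target quantiles instead of fixing them.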

Author Information

Marco Cuturi (Google)
Olivier Teboul (Google Brain)
Jonathan Niles-Weed (NYU)
JP Vert (Google)
