

Poster

Supervised Quantile Normalization for Low Rank Matrix Factorization

Marco Cuturi · Olivier Teboul · Jonathan Niles-Weed · Jean-Philippe Vert

Keywords: [ General Machine Learning Techniques ] [ Matrix/Tensor Methods ] [ Unsupervised Learning ] [ Computational Biology and Genomics ]


Abstract: Low rank matrix factorization is a fundamental building block in machine learning, used for instance to summarize gene expression profile data or word-document counts. To be robust to outliers and differences in scale across features, a matrix factorization step is usually preceded by ad hoc feature normalization steps, such as tf-idf scaling or data whitening. We propose in this work to learn these normalization operators jointly with the factorization itself. More precisely, given a $d\times n$ matrix $X$ of $d$ features measured on $n$ individuals, we propose to learn the parameters of quantile normalization operators that can operate row-wise on the values of $X$ and/or of its factorization $UV$ to improve the quality of the low-rank representation of $X$ itself. This optimization is facilitated by the introduction of differentiable quantile normalization operators derived using regularized optimal transport algorithms.
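To make the core idea concrete, the following is a minimal NumPy sketch (not the authors' implementation) of a differentiable quantile normalization operator built from entropy-regularized optimal transport: a row of values is transported onto a vector of target quantiles via Sinkhorn iterations, and each value is replaced by the barycentric projection of its transport-plan row. The function name, the squared-distance cost, and the hyperparameters `eps` and `iters` are illustrative choices, not taken from the paper; in the paper's setting the target quantile values would be the learned parameters.

```python
import numpy as np

def soft_quantile_normalize(x, targets, eps=0.1, iters=300):
    """Smoothly map values of x onto the target quantile values.

    x       : (n,) array of input values (one row of the data matrix).
    targets : (m,) sorted array of target quantile values (learnable
              parameters in the paper's setting; fixed here).
    eps     : entropic regularization strength; smaller eps approaches
              hard (rank-based) quantile normalization.
    """
    n, m = len(x), len(targets)
    a = np.full(n, 1.0 / n)            # uniform weights on input values
    b = np.full(m, 1.0 / m)            # uniform weights on targets
    C = (x[:, None] - targets[None, :]) ** 2   # squared-distance cost matrix
    K = np.exp(-C / eps)               # Gibbs kernel
    u = np.ones(n)
    for _ in range(iters):             # Sinkhorn fixed-point iterations
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]    # transport plan; row i sums to a_i
    # Barycentric projection: each x_i becomes a convex combination
    # of the targets, differentiable in both x and targets.
    return (P @ targets) / a
```

Because every step is a smooth elementary operation, gradients with respect to both the inputs and the target quantiles flow through the operator, which is what allows the normalization to be trained jointly with the factorization.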
