
Spotlight

Dataset Dynamics via Gradient Flows in Probability Space

David Alvarez-Melis · Nicolo Fusi

Abstract:

Many machine learning tasks, from generative modeling to domain adaptation, revolve around dataset transformation and manipulation. While various methods exist for transforming unlabeled datasets, principled methods for transforming labeled (e.g., classification) datasets are missing. In this work, we propose a novel framework for dataset transformation, which we cast as optimization over data-generating joint probability distributions. We approach this class of problems through Wasserstein gradient flows in probability space and derive practical, efficient particle-based methods for a flexible but well-behaved class of objective functions. Through various experiments, we show that this framework can be used to impose constraints on classification datasets, adapt them for transfer learning, or re-purpose fixed or black-box models to classify previously unseen datasets with high accuracy.
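To make the particle-based idea concrete, here is a minimal sketch (not the authors' code) of a forward-Euler particle discretization of a gradient flow on a labeled dataset: each labeled example is a particle, a functional of the empirical distribution is defined, and particle positions follow its gradient. The per-class MMD objective, the fixed labels, and all function names below are illustrative assumptions; the paper considers a broader, well-behaved class of objectives.

```python
# Hypothetical sketch: particle gradient flow on a labeled dataset.
# Assumptions: labels stay fixed and the objective is a per-class MMD
# pulling source particles toward a fixed target dataset.
import torch


def rbf_kernel(a, b, sigma=1.0):
    # Gaussian kernel between two sets of feature vectors.
    d2 = torch.cdist(a, b) ** 2
    return torch.exp(-d2 / (2 * sigma ** 2))


def mmd2(x, y):
    # Squared maximum mean discrepancy between particle sets x and y.
    return rbf_kernel(x, x).mean() + rbf_kernel(y, y).mean() - 2 * rbf_kernel(x, y).mean()


def flow_labeled_dataset(x_src, y_src, x_tgt, y_tgt, steps=200, step_size=0.5):
    """Evolve source particles toward the target, matching within each class.

    x_*: (n, d) features; y_*: (n,) integer labels. Only feature positions move.
    """
    x = x_src.clone().requires_grad_(True)
    for _ in range(steps):
        # Objective: sum of per-class MMD terms, so only same-label particles are matched.
        energy = sum(
            mmd2(x[y_src == c], x_tgt[y_tgt == c])
            for c in torch.unique(y_src)
        )
        (grad,) = torch.autograd.grad(energy, x)
        with torch.no_grad():
            x -= step_size * grad  # explicit Euler step on particle positions
    return x.detach()


if __name__ == "__main__":
    torch.manual_seed(0)
    # Toy two-class source and target datasets in 2-D.
    xs, ys = torch.randn(100, 2), torch.randint(0, 2, (100,))
    xt, yt = torch.randn(100, 2) + torch.tensor([3.0, 0.0]), torch.randint(0, 2, (100,))
    xs_moved = flow_labeled_dataset(xs, ys, xt, yt)
    print("final per-class MMD:",
          sum(mmd2(xs_moved[ys == c], xt[yt == c]).item() for c in torch.unique(ys)))
```

In this reading, the discrete update is a time discretization of a gradient flow of the chosen functional over the empirical distribution; richer objectives (e.g., constraints on the dataset or transfer-learning adaptation, as in the abstract) would replace the placeholder energy.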
