Oral
Geometric Losses for Distributional Learning
Arthur Mensch · Mathieu Blondel · Gabriel Peyré

Thu Jun 13 09:00 AM -- 09:20 AM (PDT) @ Room 103

Building upon recent advances in entropy-regularized optimal transport and upon Fenchel duality between measures and continuous functions, we propose in this paper a generalization of the logistic loss that incorporates a metric or cost between classes. Unlike previous attempts to use optimal transport distances for learning, our loss results in unconstrained convex objective functions, supports infinite (or very large) class spaces, and naturally defines a geometric generalization of the softmax operator. The geometric properties of this loss make it suitable for predicting sparse and singular distributions, for instance those supported on curves or hyper-surfaces. We study the theoretical properties of our loss and showcase its effectiveness on two applications: ordinal regression and drawing generation.
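
The loss described above builds on entropy-regularized optimal transport between distributions over classes. As an illustration of that underlying machinery only (not the paper's geometric loss or geometric softmax themselves), the following minimal Python sketch computes a Sinkhorn-style regularized transport cost between a predicted class distribution and a one-hot target; the squared-distance ground cost, the regularization strength eps, and the iteration count are illustrative assumptions.

    import numpy as np

    def sinkhorn(alpha, beta, C, eps=0.1, n_iters=200):
        """Entropy-regularized OT between histograms alpha and beta
        with ground cost C, via Sinkhorn iterations (illustrative sketch)."""
        K = np.exp(-C / eps)              # Gibbs kernel
        u = np.ones_like(alpha)
        v = np.ones_like(beta)
        for _ in range(n_iters):
            u = alpha / (K @ v)           # scale rows to match alpha
            v = beta / (K.T @ u)          # scale columns to match beta
        P = u[:, None] * K * v[None, :]   # approximate transport plan
        return np.sum(P * C), P

    # Toy example: 5 ordinal classes with a squared-distance ground cost,
    # comparing a predicted distribution against a (smoothed) one-hot target.
    classes = np.arange(5, dtype=float)
    C = (classes[:, None] - classes[None, :]) ** 2
    pred = np.array([0.1, 0.2, 0.4, 0.2, 0.1])
    target = np.full(5, 1e-6)
    target[2] = 1.0
    target /= target.sum()
    cost, plan = sinkhorn(pred, target, C)
    print(f"regularized transport cost: {cost:.3f}")

The Sinkhorn iterations here only illustrate the entropic-OT building block; as the abstract notes, the paper's contribution is an unconstrained convex loss derived from this machinery rather than the raw transport distance itself.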

Author Information

Arthur Mensch (ENS)
Mathieu Blondel (NTT)
Gabriel Peyré (CNRS and ENS)
