Distribution Regression with Sliced Wasserstein Kernels
Dimitri Marie Meunier · Massimiliano Pontil · Carlo Ciliberto

Thu Jul 21 08:50 AM -- 08:55 AM (PDT) @ Room 327 - 329

The problem of learning functions over spaces of probability distributions - or distribution regression - is gaining significant interest in the machine learning community. The main challenge in these settings is to identify a suitable representation capturing all relevant properties of a distribution. The well-established approach is to use kernel mean embeddings, which lift kernel-induced similarity on the input domain to the level of probability distributions. This strategy effectively tackles the two-stage sampling nature of the problem, enabling one to derive estimators with strong statistical guarantees, such as universal consistency and excess risk bounds. However, kernel mean embeddings implicitly hinge on the maximum mean discrepancy (MMD), a metric on probabilities that is not well suited to capture geometrical relations between distributions. In contrast, optimal transport (OT) metrics are potentially more appealing. In this work, we propose an OT-based estimator for distribution regression. We build on the Sliced Wasserstein distance to obtain an OT-based representation. We study the theoretical properties of a kernel ridge regression estimator based on this representation, for which we prove universal consistency and excess risk bounds. Preliminary experiments complement our theoretical findings by showing the effectiveness of the proposed approach and comparing it with MMD-based estimators.
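To illustrate the kind of representation the abstract refers to, the following is a minimal, generic sketch of a Monte Carlo Sliced Wasserstein distance between two empirical distributions, which could then be plugged into a kernel such as exp(-SW²/(2σ²)) for kernel ridge regression. This is an assumed illustrative implementation (the function name, parameters, and equal-sample-size restriction are choices made here), not the paper's exact estimator.

```python
import numpy as np

def sliced_wasserstein(X, Y, n_projections=100, p=2, seed=None):
    """Monte Carlo estimate of the Sliced Wasserstein-p distance between
    the empirical distributions of samples X and Y, each of shape (n, d).

    Assumes equal sample sizes, so the 1D Wasserstein distance along each
    projection reduces to comparing sorted projected samples.
    """
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Draw random projection directions uniformly on the unit sphere.
    theta = rng.normal(size=(n_projections, d))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)
    # Project both samples onto each direction -> 1D distributions.
    X_proj = X @ theta.T  # shape (n, n_projections)
    Y_proj = Y @ theta.T
    # 1D Wasserstein-p via the order statistics of each projection.
    X_sort = np.sort(X_proj, axis=0)
    Y_sort = np.sort(Y_proj, axis=0)
    return float(np.mean(np.abs(X_sort - Y_sort) ** p) ** (1.0 / p))
```

The distance is zero for identical samples and grows with, e.g., a translation between the two point clouds; since each projection reduces the problem to one dimension, the cost is dominated by sorting, avoiding the full OT problem in d dimensions.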

Author Information

Dimitri Marie Meunier (UCL)
Massimiliano Pontil (Istituto Italiano di Tecnologia and University College London)
Carlo Ciliberto (University College London)
