

Poster in Workshop: 2nd Annual Workshop on Topology, Algebra, and Geometry in Machine Learning (TAG-ML)

Sample Complexity Bounds for Estimating the Wasserstein Distance under Invariances

Behrooz Tahmasebi · Stefanie Jegelka


Abstract:

Group-invariant probability distributions appear in many data-generating models in machine learning, e.g., models of graphs, point clouds, and images. In practice, one often needs to estimate divergences between such distributions. In this work, we study how the inherent invariances, with respect to any smooth action of a Lie group on a manifold, improve the sample complexity of estimating the Wasserstein distance. Our result indicates a twofold gain: (1) reducing the sample complexity by a multiplicative factor corresponding to the group size (for finite groups) or the normalized volume of the quotient space (for groups of positive dimension), and (2) improving the exponent in the convergence rate (for groups of positive dimension). These results are completely new for groups of positive dimension and tighten recent bounds for finite group actions.
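For orientation, the LaTeX sketch below contrasts the classical baseline with the abstract's two claimed gains. The baseline is Dudley's well-known n^{-1/d} rate for empirical Wasserstein-1 estimation; the two "gain" displays are only a heuristic rendering of the abstract's claims, not the paper's exact theorem: the symbols |G| and dim G and all constants are illustrative, and the quotient-dimension count assumes the action is free, so that M/G has dimension d - dim G.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
Let $M$ be a compact $d$-dimensional manifold, $\mu$ a $G$-invariant
probability measure on $M$, and $\hat\mu_n$ the empirical measure of $n$
i.i.d.\ samples. The classical baseline (Dudley), for $d \ge 3$, is
\[
  \mathbb{E}\, W_1(\mu, \hat\mu_n) \;\asymp\; n^{-1/d}.
\]
% Gain (1), finite G (heuristic): a multiplicative factor corresponding to
% the group size, as if |G| * n samples had been drawn; exponent unchanged.
\[
  \mathbb{E}\, W_1(\mu, \hat\mu_n) \;\lesssim\; (|G|\, n)^{-1/d}.
\]
% Gain (2), dim G > 0 (heuristic): the effective dimension drops to that of
% the quotient M/G (assumed here to be d - dim G for a free action),
% improving the exponent in the rate.
\[
  \mathbb{E}\, W_1(\mu, \hat\mu_n) \;\lesssim\; n^{-1/(d - \dim G)}.
\]
\end{document}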
