
Poster

Generative Conditional Distributions by Neural (Entropic) Optimal Transport

Bao Nguyen · Binh Nguyen · Hieu Nguyen · Viet Anh Nguyen


Abstract:

A fundamental challenge in conditional distribution learning is that one must learn not a single distribution but a family of distributions, one for each value of the covariates. In this work, we introduce a novel neural entropic optimal transport method designed to effectively learn generative models of conditional distributions, particularly in scenarios characterized by limited sample sizes. Our method relies on the minimax training of two neural networks: a network parametrizing the inverse cumulative distribution functions of the conditional distributions, and another network parametrizing the conditional Kantorovich potential. To prevent overfitting, we regularize the objective function by penalizing the Lipschitz constant of the network output. Our experiments on real-world datasets show the effectiveness of our algorithm compared to state-of-the-art conditional distribution learning techniques.
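The minimax scheme in the abstract can be sketched as follows. This is a minimal, illustrative PyTorch sketch, not the authors' implementation: `QuantileNet`, `PotentialNet`, the gradient-norm penalty, and all hyperparameter values (`eps` is omitted entirely; the entropic smoothing term is simplified to a plain dual objective) are assumptions made for illustration. It shows the two-network structure: a generator acting as a conditional inverse CDF, an adversarial conditional Kantorovich potential, and a Lipschitz-style regularizer on the potential.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

class QuantileNet(nn.Module):
    """Generator: maps (covariate x, uniform level u) to a sample,
    playing the role of a parametrized conditional inverse CDF."""
    def __init__(self, x_dim=1, hidden=32):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(x_dim + 1, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 1))

    def forward(self, x, u):
        return self.net(torch.cat([x, u], dim=1))

class PotentialNet(nn.Module):
    """Conditional Kantorovich potential f(y; x)."""
    def __init__(self, x_dim=1, hidden=32):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(x_dim + 1, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 1))

    def forward(self, x, y):
        return self.net(torch.cat([x, y], dim=1))

def lipschitz_penalty(pot, x, y):
    """Squared gradient-norm penalty, one common surrogate for
    penalizing a network's Lipschitz constant (an assumption here)."""
    y = y.clone().detach().requires_grad_(True)
    grad = torch.autograd.grad(pot(x, y).sum(), y, create_graph=True)[0]
    return (grad.norm(2, dim=1) ** 2).mean()

# Synthetic data: y | x ~ N(2x, 0.1^2), standing in for a real dataset.
n = 256
x = torch.rand(n, 1)
y = 2 * x + 0.1 * torch.randn(n, 1)

gen, pot = QuantileNet(), PotentialNet()
opt_g = torch.optim.Adam(gen.parameters(), lr=1e-3)
opt_f = torch.optim.Adam(pot.parameters(), lr=1e-3)
lam = 1.0  # regularization weight (illustrative value)

for step in range(50):
    u = torch.rand(n, 1)            # uniform levels fed to the inverse CDF
    y_fake = gen(x, u)
    # Potential step: separate real from generated samples, kept smooth
    # by the Lipschitz-style penalty.
    f_loss = (-(pot(x, y).mean() - pot(x, y_fake.detach()).mean())
              + lam * lipschitz_penalty(pot, x, y))
    opt_f.zero_grad(); f_loss.backward(); opt_f.step()
    # Generator step: move generated samples toward high-potential regions.
    g_loss = -pot(x, gen(x, u)).mean()
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

final_g_loss = float(g_loss)
```

After training, sampling the conditional distribution at a covariate value `x0` amounts to drawing `u ~ Uniform(0, 1)` and evaluating `gen(x0, u)`; the entropic regularization and the exact objective of the paper are not reproduced here.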
