

Poster

Generative Conditional Distributions by Neural (Entropic) Optimal Transport

Bao Nguyen · Binh Nguyen · Trung Hieu Nguyen · Viet Anh Nguyen

Hall C 4-9 #407
Tue 23 Jul 4:30 a.m. PDT — 6 a.m. PDT

Abstract:

Learning conditional distributions is challenging because the desired outcome is not a single distribution but multiple distributions that correspond to multiple instances of the covariates. We introduce a novel neural entropic optimal transport method designed to effectively learn generative models of conditional distributions, particularly in scenarios characterized by limited sample sizes. Our method relies on the minimax training of two neural networks: a generative network parametrizing the inverse cumulative distribution functions of the conditional distributions and another network parametrizing the conditional Kantorovich potential. To prevent overfitting, we regularize the objective function by penalizing the Lipschitz constant of the network output. Our experiments on real-world datasets show the effectiveness of our algorithm compared to state-of-the-art conditional distribution learning techniques. Our implementation can be found at https://github.com/nguyenngocbaocmt02/GENTLE.
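The abstract describes a minimax objective pairing a generator (parametrizing inverse CDFs) against a network parametrizing the Kantorovich potential, regularized by a Lipschitz penalty. The sketch below is not the authors' GENTLE implementation; it is a minimal numpy illustration of the entropic semi-dual objective and a finite-difference Lipschitz-style penalty, assuming a one-dimensional squared-distance cost and placeholder linear maps in place of the two neural networks. All names (`entropic_semi_dual`, `lipschitz_penalty`, the toy `f` and `g`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def entropic_semi_dual(f_vals, x, y, eps):
    """Entropic OT semi-dual objective for a potential f evaluated at x.

    Uses the smoothed c-transform
        f^c_eps(y) = -eps * log mean_x exp((f(x) - c(x, y)) / eps)
    with squared-distance cost c(x, y) = (x - y)^2 (an assumption here).
    """
    cost = (x[:, None] - y[None, :]) ** 2      # pairwise cost matrix
    a = (f_vals[:, None] - cost) / eps
    m = a.max(axis=0)                          # log-sum-exp stabilization
    f_c = -eps * (np.log(np.mean(np.exp(a - m), axis=0)) + m)
    return f_vals.mean() + f_c.mean()

def lipschitz_penalty(g, z, delta=1e-3):
    """Finite-difference estimate of a squared-slope (Lipschitz) penalty on g."""
    slope = (g(z + delta) - g(z)) / delta
    return np.mean(slope ** 2)

# Toy stand-ins for the two networks in the minimax objective.
f = lambda x: 0.1 * x        # plays the role of the Kantorovich-potential network
g = lambda u: 2.0 * u + 1.0  # plays the role of the inverse-CDF generator

x = rng.normal(size=256)              # target-side samples
y = g(rng.uniform(size=256))          # generator pushes uniform noise forward
obj = entropic_semi_dual(f(x), x, y, eps=0.5)
reg = lipschitz_penalty(g, rng.uniform(size=256))
loss = -obj + 0.1 * reg               # generator minimizes; potential maximizes obj
print(float(loss))
```

In actual training both maps would be neural networks conditioned on the covariates and updated alternately by gradient steps on this minimax loss; the Lipschitz penalty on the generator output is what the abstract credits with preventing overfitting in small-sample regimes.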
