In this paper, we study generative models of sequential discrete data. To tackle the exposure bias problem inherent in maximum likelihood estimation (MLE), generative adversarial networks (GANs) have been introduced to penalize unrealistic generated samples. To exploit the supervision signal from the discriminator, most previous models leverage REINFORCE to address the non-differentiability of sequential discrete data. However, because the training signal is unstable during the dynamic process of adversarial training, the effectiveness of REINFORCE in this setting is hardly guaranteed. To deal with this problem, we propose a novel approach called Cooperative Training (CoT) to improve the training of sequence generative models. CoT transforms the min-max game of GANs into a joint maximization framework and manages to explicitly estimate and optimize the Jensen-Shannon divergence. Moreover, CoT works without the pre-training via MLE that is crucial to the success of previous methods. In the experiments, compared to existing state-of-the-art methods, CoT shows superior or at least competitive performance in sample quality, diversity, and training stability.
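The joint-maximization idea in the abstract can be illustrated on a toy single-symbol case: a mediator fit by MLE on a balanced mixture of real and generated samples approaches the mixture density (P+G)/2, and the generator then descends its own share of the Jensen-Shannon divergence, KL(G‖M). The sketch below is a minimal illustration under simplifying assumptions, not the paper's sequence-level algorithm: the target distribution `p`, the learning rate, and the step count are invented for illustration, and the mediator update uses the closed-form mixture in place of a learned density estimator.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def kl(a, b):
    """KL divergence between two discrete distributions."""
    return float(np.sum(a * (np.log(a) - np.log(b))))

def jsd(p, q):
    """Jensen-Shannon divergence, the quantity CoT estimates and optimizes."""
    m = 0.5 * (p + q)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Hypothetical target distribution over 4 discrete symbols (invented for this sketch)
p = np.array([0.5, 0.3, 0.15, 0.05])
theta = np.zeros(4)  # generator logits
lr = 0.5

for step in range(2000):
    q = softmax(theta)
    # Mediator step: the MLE on a balanced real/generated mixture is (p + q) / 2;
    # here we take that closed form directly instead of fitting a model
    m = 0.5 * (p + q)
    # Generator step: gradient descent on KL(q || m), the generator's share of
    # the Jensen-Shannon divergence (the other share, KL(p || m), is constant in q)
    r = np.log(q) - np.log(m)
    grad = q * (r - np.dot(q, r))  # exact gradient of KL(q || m) w.r.t. the logits
    theta -= lr * grad

q = softmax(theta)
```

At the fixed point q = p the mediator equals the data distribution and the gradient vanishes, so alternating the two steps drives the generator toward the data distribution while the JSD estimate shrinks, which is the cooperative dynamic the abstract describes.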
Author Information
Sidi Lu (Shanghai Jiao Tong University)
Lantao Yu (Stanford University)
Siyuan Feng (Apex Data & Knowledge Management Lab, Shanghai Jiao Tong University)
Yaoming Zhu (Apex Data & Knowledge Management Lab, Shanghai Jiao Tong University)
Weinan Zhang (Apex Data & Knowledge Management Lab, Shanghai Jiao Tong University)
Related Events (a corresponding poster, oral, or spotlight)
- 2019 Oral: CoT: Cooperative Training for Generative Modeling of Discrete Data
  Thu. Jun 13th 06:30 -- 06:35 PM, Hall B
More from the Same Authors
- 2022 Poster: A General Recipe for Likelihood-free Bayesian Optimization
  Jiaming Song · Lantao Yu · Willie Neiswanger · Stefano Ermon
- 2022 Oral: A General Recipe for Likelihood-free Bayesian Optimization
  Jiaming Song · Lantao Yu · Willie Neiswanger · Stefano Ermon
- 2020 Poster: Training Deep Energy-Based Models with f-Divergence Minimization
  Lantao Yu · Yang Song · Jiaming Song · Stefano Ermon
- 2019 Poster: Multi-Agent Adversarial Inverse Reinforcement Learning
  Lantao Yu · Jiaming Song · Stefano Ermon
- 2019 Poster: Neurally-Guided Structure Inference
  Sidi Lu · Jiayuan Mao · Josh Tenenbaum · Jiajun Wu
- 2019 Oral: Neurally-Guided Structure Inference
  Sidi Lu · Jiayuan Mao · Josh Tenenbaum · Jiajun Wu
- 2019 Oral: Multi-Agent Adversarial Inverse Reinforcement Learning
  Lantao Yu · Jiaming Song · Stefano Ermon