Dispersed Exponential Family Mixture VAEs for Interpretable Text Generation
Wenxian Shi · Hao Zhou · Ning Miao · Lei Li

Tue Jul 14 09:00 AM -- 09:45 AM & Tue Jul 14 08:00 PM -- 08:45 PM (PDT)

Interpretability is important in text generation, as it allows the generation process to be guided by interpretable attributes. Variational auto-encoders (VAEs) with a Gaussian prior have been successfully applied to text generation, but the meaning of the latent variable is hard to interpret. To enhance controllability and interpretability, one can replace the Gaussian prior with a mixture of Gaussian distributions (GM-VAE), whose mixture components can correspond to latent attributes of the data. Unfortunately, straightforward variational training of GM-VAE leads to the mode-collapse problem. In this paper, we show that mode collapse is a general problem for VAEs with exponential-family mixture priors. We propose DEM-VAE, which introduces an extra dispersion term to induce a well-structured latent space. Experimental results show that our approach indeed obtains a well-structured latent space, with which it outperforms strong baselines on interpretable text generation benchmarks.
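To make the two ingredients of the abstract concrete, the sketch below computes (1) the log-density of a latent code under an equal-weight Gaussian-mixture prior, and (2) a simple dispersion measure over the component means. Note this is a minimal NumPy illustration, not the paper's implementation: the `dispersion` function here is an illustrative proxy (mean squared distance of component means from their centroid, so larger values mean more spread-out components), and the function names and the fixed shared variance `sigma` are assumptions for the example.

```python
import numpy as np

def gm_log_prior(z, means, sigma=1.0):
    """Log-density of z under an equal-weight, equal-variance
    Gaussian mixture prior with the given component means."""
    K, D = means.shape
    diff = z[None, :] - means  # (K, D) residuals to each component mean
    # log N(z | mu_k, sigma^2 I) for each component k
    log_comp = (-0.5 * np.sum(diff ** 2, axis=1) / sigma ** 2
                - 0.5 * D * np.log(2 * np.pi * sigma ** 2))
    # log-sum-exp over components with uniform weights 1/K
    return np.logaddexp.reduce(np.log(1.0 / K) + log_comp)

def dispersion(means):
    """Illustrative dispersion proxy: mean squared distance of the
    component means from their centroid. When all components collapse
    onto one point (mode collapse), this value drops to zero."""
    centroid = means.mean(axis=0)
    return np.mean(np.sum((means - centroid) ** 2, axis=1))
```

Under this reading, the training objective would add a weighted dispersion reward to the usual ELBO (e.g. `loss = -elbo - beta * dispersion(means)`), discouraging the degenerate solution in which all mixture components coincide.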

Author Information

Wenxian Shi (Bytedance)
Hao Zhou (Bytedance)
Ning Miao (ByteDance AI Lab)
Lei Li (ByteDance AI Lab)
