We propose a latent space energy-based prior model for text generation and classification. The model builds on a generator network that generates the text sequence from a continuous latent vector. The energy term of the prior model couples the continuous latent vector with a symbolic one-hot vector, so that the discrete category can be inferred from the observed example through the continuous latent vector. This latent space coupling naturally enables the incorporation of information bottleneck regularization, which encourages the continuous latent vector to extract information from the observed example that is informative of the underlying category. In our learning method, the symbol-vector coupling, the generator network, and the inference network are learned jointly. Our model can be learned in an unsupervised setting where no category labels are provided. It can also be learned in a semi-supervised setting where category labels are provided for a subset of training examples. Our experiments demonstrate that the proposed model learns a well-structured and meaningful latent space, which (1) guides the generator to generate text with high quality, diversity, and interpretability, and (2) effectively classifies text.
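The symbol-vector coupling described in the abstract can be illustrated with a minimal sketch. Under the coupling, the prior is proportional to exp(⟨y, f(z)⟩) times a Gaussian base measure on z, so the category posterior given z is simply a softmax over the score network's logits. The names `f_alpha`, `W`, and the dimensions below are hypothetical stand-ins, not the authors' actual architecture:

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over a 1-D logit vector.
    e = np.exp(logits - logits.max())
    return e / e.sum()

rng = np.random.default_rng(0)
K, D = 4, 8                       # assumed number of categories / latent dim
W = rng.normal(size=(K, D))       # stand-in parameters for the score network

def f_alpha(z):
    # Maps latent z to K logits; <y, f_alpha(z)> for each one-hot y.
    return W @ z

def energy(y_onehot, z):
    # Coupling energy: -<y, f_alpha(z)> plus the Gaussian base term ||z||^2/2.
    return -y_onehot @ f_alpha(z) + 0.5 * z @ z

# Inferring the discrete category from the continuous latent vector:
# p(y | z) = softmax(f_alpha(z)), since the base measure cancels across y.
z = rng.normal(size=D)
p_y_given_z = softmax(f_alpha(z))
predicted_category = int(np.argmax(p_y_given_z))
```

The key design point is that classification requires no separate classifier head: once z is inferred from the text, the coupled prior itself yields the category distribution.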
Author Information
Bo Pang (University of California Los Angeles)
Ying Nian Wu (UCLA)
Related Events (a corresponding poster, oral, or spotlight)
- 2021 Poster: Latent Space Energy-Based Model of Symbol-Vector Coupling for Text Generation and Classification
  Fri. Jul 23rd 04:00 -- 06:00 AM
More from the Same Authors
- 2022 Poster: Latent Diffusion Energy-Based Model for Interpretable Text Modelling
  Peiyu Yu · Sirui Xie · Xiaojian Ma · Baoxiong Jia · Bo Pang · Ruiqi Gao · Yixin Zhu · Song-Chun Zhu · Ying Nian Wu
- 2022 Spotlight: Latent Diffusion Energy-Based Model for Interpretable Text Modelling
  Peiyu Yu · Sirui Xie · Xiaojian Ma · Baoxiong Jia · Bo Pang · Ruiqi Gao · Yixin Zhu · Song-Chun Zhu · Ying Nian Wu
- 2021 Workshop: ICML Workshop on Theoretic Foundation, Criticism, and Application Trend of Explainable AI
  Quanshi Zhang · Tian Han · Lixin Fan · Zhanxing Zhu · Hang Su · Ying Nian Wu
- 2020 Poster: Closed Loop Neural-Symbolic Learning via Integrating Neural Perception, Grammar Parsing, and Symbolic Reasoning
  Qing Li · Siyuan Huang · Yining Hong · Yixin Chen · Ying Nian Wu · Song-Chun Zhu