Poster

Recurrent Hierarchical Topic-Guided RNN for Language Generation

Dandan Guo · Bo Chen · Ruiying Lu · Mingyuan Zhou

Keywords: [ Probabilistic Inference - Models and Probabilistic Programming ] [ Deep Generative Models ] [ Bayesian Methods ] [ Approximate Inference ]


Abstract:

To simultaneously capture syntax and global semantics from a text corpus, we propose a new larger-context recurrent neural network (RNN) based language model, which extracts recurrent hierarchical semantic structure via a dynamic deep topic model to guide natural language generation. Moving beyond a conventional RNN-based language model that ignores long-range word dependencies and sentence order, the proposed model captures not only intra-sentence word dependencies, but also temporal transitions between sentences and inter-sentence topic dependencies. For inference, we develop a hybrid of stochastic-gradient Markov chain Monte Carlo and recurrent autoencoding variational Bayes. Experimental results on a variety of real-world text corpora demonstrate that the proposed model not only outperforms larger-context RNN-based language models, but also learns interpretable recurrent multilayer topics and generates diverse sentences and paragraphs that are syntactically correct and semantically coherent.
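As a rough illustration of the topic-guided conditioning described in the abstract, below is a minimal sketch assuming PyTorch. The class name `TopicGuidedRNNLM`, its dimensions, and the way topic proportions are injected into the RNN input are illustrative assumptions; this is not the authors' hierarchical dynamic topic model or their hybrid SG-MCMC/variational inference, only the general idea of letting a sentence-level topic vector guide word-by-word generation.

```python
# Minimal sketch of a topic-guided RNN language model (assumes PyTorch).
# Illustrative only: conditions an RNN LM on a per-sentence topic proportion
# vector so word generation reflects both local syntax and global semantics.
import torch
import torch.nn as nn

class TopicGuidedRNNLM(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256, num_topics=50):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Project the (hypothetical) topic proportions and feed them to the
        # RNN at every time step alongside the word embedding.
        self.topic_proj = nn.Linear(num_topics, hidden_dim)
        self.rnn = nn.GRU(embed_dim + hidden_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens, topic_props):
        # tokens: (batch, seq_len) word indices of the current sentence
        # topic_props: (batch, num_topics) sentence-level topic proportions
        emb = self.embed(tokens)                                  # (B, T, E)
        topic_feat = self.topic_proj(topic_props)                 # (B, H)
        topic_feat = topic_feat.unsqueeze(1).expand(-1, emb.size(1), -1)
        rnn_in = torch.cat([emb, topic_feat], dim=-1)             # (B, T, E+H)
        hidden, _ = self.rnn(rnn_in)
        return self.out(hidden)                                   # (B, T, V) next-word logits

# Toy usage: next-word logits for a random batch.
model = TopicGuidedRNNLM(vocab_size=1000)
tokens = torch.randint(0, 1000, (4, 12))
topic_props = torch.softmax(torch.randn(4, 50), dim=-1)
logits = model(tokens, topic_props)  # shape (4, 12, 1000)
```

In the paper's setting the topic proportions would come from a recurrent hierarchical topic model shared across sentences, so that inter-sentence topic dependencies, not just intra-sentence word order, shape generation.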
