Sequence to Better Sequence: Continuous Revision of Combinatorial Structures
Jonas Mueller · David Gifford · Tommi Jaakkola

Tue Aug 08 01:30 AM -- 05:00 AM (PDT) @ Gallery #114

We present a model that, after learning from observations of (sequence, outcome) pairs, can be efficiently used to revise a new sequence in order to improve its associated outcome. Our framework requires neither example improvements nor additional evaluation of outcomes for proposed revisions. To avoid a combinatorial search over sequence elements, we specify a generative model with continuous latent factors, which is learned via joint approximate inference using a recurrent variational autoencoder (VAE) and an outcome-predicting neural network module. Under this model, gradient methods can be used to efficiently optimize the continuous latent factors with respect to inferred outcomes. By appropriately constraining this optimization and using the VAE decoder to generate a revised sequence, we ensure the revision is fundamentally similar to the original sequence, is associated with better outcomes, and looks natural. These desiderata are proven to hold with high probability under our approach, which is empirically demonstrated for revising natural language sentences.
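The core revision step described above can be sketched in a few lines: starting from the latent code of the original sequence, take gradient steps that increase the predicted outcome while staying inside a ball around the original code, so the decoded revision remains similar to the input. The sketch below is purely illustrative and uses a stand-in linear outcome predictor; the learned VAE encoder/decoder and neural outcome module from the paper are omitted, and all names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 16
w = rng.normal(size=dim)       # stand-in outcome predictor: f(z) = w . z
z0 = rng.normal(size=dim)      # latent code of the original sequence
radius = 0.5                   # similarity constraint: ||z - z0|| <= radius

def predicted_outcome(z):
    return float(w @ z)

def revise_latent(z0, steps=100, lr=0.05):
    """Projected gradient ascent on the predicted outcome within the ball."""
    z = z0.copy()
    for _ in range(steps):
        z = z + lr * w                # gradient of f(z) = w . z is w
        delta = z - z0
        norm = np.linalg.norm(delta)
        if norm > radius:             # project back onto the constraint ball
            z = z0 + delta * (radius / norm)
    return z

z_star = revise_latent(z0)
print(predicted_outcome(z0), predicted_outcome(z_star))
```

In the full model, `z_star` would then be passed through the VAE decoder to produce the revised sequence; the constraint radius is what makes the revision stay close to the original.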

Author Information

Jonas Mueller (MIT)
David Gifford (MIT)
Tommi Jaakkola (MIT)
