Poster
A Chance-Constrained Generative Framework for Sequence Optimization
Xianggen Liu · Qiang Liu · Sen Song · Jian Peng
Keywords: [ Deep Generative Models ] [ Deep Sequence Models ] [ Optimization ] [ Deep Learning - Generative Models and Autoencoders ]
Deep generative modeling has achieved many successes in continuous data generation, such as producing realistic images and controlling their properties (e.g., styles). However, the development of generative modeling techniques for optimizing discrete data, such as sequences or strings, still lags behind, largely due to the challenges of modeling complex and long-range constraints, including both syntax and semantics, in discrete structures. In this paper, we formulate the sequence optimization task as a chance-constrained optimization problem. The key idea is to enforce a high probability of generating valid sequences while also optimizing the property of interest. We propose a novel minmax algorithm that simultaneously tightens a bound on the chance of validity and optimizes the expected property. Extensive experimental results in three domains demonstrate the superiority of our approach over existing sequence optimization methods.
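To make the chance-constrained formulation concrete, the following is a minimal LaTeX sketch under assumed notation (the abstract does not specify symbols): a generator p_\theta over sequences x, a property score f(x), a set V of valid sequences, and a tolerance \epsilon on invalid generations. The paper's minmax bound-tightening algorithm is not reproduced here.

% Hedged sketch of a chance-constrained sequence-optimization objective.
% Notation is assumed, not taken from the paper: \theta parameterizes the
% generator p_\theta, f(x) is the property of interest, V is the set of
% valid sequences, and \epsilon is the allowed chance of invalidity.
\begin{equation*}
\max_{\theta}\; \mathbb{E}_{x \sim p_{\theta}}\!\left[ f(x) \right]
\quad \text{subject to} \quad
\Pr_{x \sim p_{\theta}}\!\left[ x \in V \right] \;\ge\; 1 - \epsilon .
\end{equation*}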