

Oral in Workshop: Structured Probabilistic Inference and Generative Modeling

Generative Marginalization Models

Sulin Liu · Peter Ramadge · Ryan P. Adams

Keywords: [ discrete generative models ] [ self-consistency ] [ probabilistic models ] [ marginalization ]


Abstract:

We introduce marginalization models, a new family of generative models for high-dimensional discrete data. They offer scalable and flexible generative modeling with tractable likelihoods by explicitly modeling all induced marginal distributions. Marginalization models enable fast evaluation of arbitrary marginal probabilities with a single forward pass of the neural network, overcoming a major limitation of methods with exact marginal inference, such as autoregressive models (ARMs). They also support scalable any-order generative modeling in the distribution-matching setting, where the target distribution is specified only up to normalization (e.g., by an energy or reward function), a regime that previous methods fail to scale to. We demonstrate the effectiveness of the proposed model on a variety of discrete data distributions, including binary images, language, physical systems, and molecules, on both likelihood-maximization and distribution-matching tasks. Marginalization models achieve orders-of-magnitude speedups in evaluating the probability mass function, and for distribution matching they enable scalable training of any-order generative models that previous methods cannot achieve.
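To make the core idea concrete, below is a minimal, hypothetical sketch of the marginalization self-consistency principle described in the abstract: a single network estimates the log marginal probability of any partially observed configuration, and training penalizes violations of the identity p(x_S) = Σ_{x_i} p(x_S, x_i). All names here (`LogMarginalNet`, `self_consistency_loss`, the mask encoding, the MLP architecture) are illustrative assumptions, not the paper's actual parameterization or training objective.

```python
import torch
import torch.nn as nn

D = 8  # number of binary variables (illustrative choice)

class LogMarginalNet(nn.Module):
    """Hypothetical net: maps (values, observed-mask) to a scalar log-marginal
    estimate, so any marginal query costs a single forward pass."""
    def __init__(self, d: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * d, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
        # x: (batch, d) variable values; mask: (batch, d), 1 = observed.
        # Unobserved values are zeroed out so they cannot leak information.
        inp = torch.cat([x * mask, mask], dim=-1)
        return self.net(inp).squeeze(-1)  # estimated log p(x_S)

def self_consistency_loss(model: LogMarginalNet, x: torch.Tensor,
                          mask: torch.Tensor, i: int) -> torch.Tensor:
    """Penalize deviation from log p(x_S) = logsumexp_{x_i} log p(x_S, x_i),
    for one index i that must be UNOBSERVED (mask[:, i] == 0)."""
    log_pS = model(x, mask)
    mask_i = mask.clone()
    mask_i[:, i] = 1.0  # extend the observed set by variable i
    x0, x1 = x.clone(), x.clone()
    x0[:, i], x1[:, i] = 0.0, 1.0  # the two binary values of x_i
    log_joint = torch.stack([model(x0, mask_i), model(x1, mask_i)], dim=0)
    return (log_pS - torch.logsumexp(log_joint, dim=0)).pow(2).mean()

# Usage: evaluate arbitrary marginals in one pass, then measure
# self-consistency for the (unobserved) first variable.
model = LogMarginalNet(D)
x = torch.randint(0, 2, (4, D)).float()
mask = (torch.rand(4, D) < 0.5).float()
mask[:, 0] = 0.0  # ensure index 0 is unobserved for the loss below
print(model(x, mask))                              # log-marginal estimates
print(self_consistency_loss(model, x, mask, i=0))  # consistency penalty
```

In an actual training setup this penalty would be combined with a fit to data (likelihood maximization) or to an unnormalized target (distribution matching), and the index i would be sampled over unobserved positions; the sketch only shows the constraint itself.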
