

Invited talk in Workshop: Continuous Time Perspectives in Machine Learning

Generative Modeling with Stochastic Differential Equations

Stefano Ermon


Abstract:

Generative models are typically based on either explicit representations of probability distributions (e.g., autoregressive models or VAEs) or implicit sampling procedures (e.g., GANs). We propose an alternative approach based on directly modeling the vector field of gradients of the data distribution (scores). Our framework admits flexible model architectures and requires neither sampling during training nor adversarial optimization. Additionally, score-based generative models enable exact likelihood evaluation through connections with continuous-time normalizing flows and stochastic differential equations. We produce samples of quality comparable to GANs, achieving new state-of-the-art Inception scores, and obtain excellent likelihoods on image datasets.
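To make the score-based sampling idea concrete, here is a minimal, self-contained sketch (an illustration, not the authors' implementation): Langevin dynamics draws samples from a distribution using only its score, i.e., the gradient of the log-density. For illustration the target is a standard Gaussian, whose score is known in closed form; in the actual framework the score would instead be estimated by a trained neural network.

```python
import numpy as np

def score(x):
    # Score of a standard Gaussian N(0, I): grad_x log p(x) = -x.
    # In score-based generative modeling this closed form would be
    # replaced by a learned score network s_theta(x).
    return -x

def langevin_sampling(score_fn, n_samples=2000, n_steps=500,
                      step_size=0.05, seed=0):
    """Langevin dynamics: x <- x + (eps/2) * score(x) + sqrt(eps) * z,
    with z ~ N(0, I). As eps -> 0 and steps -> inf, samples follow the
    target distribution."""
    rng = np.random.default_rng(seed)
    # Deliberately poor initialization, far from the target distribution.
    x = rng.normal(scale=5.0, size=(n_samples, 1))
    for _ in range(n_steps):
        z = rng.normal(size=x.shape)
        x = x + 0.5 * step_size * score_fn(x) + np.sqrt(step_size) * z
    return x

samples = langevin_sampling(score)
print(samples.mean(), samples.std())  # close to 0 and 1 for N(0, 1)
```

Note the sampler never evaluates the density itself, only its gradient; this is what lets the framework avoid explicit likelihood-based training and adversarial objectives.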
