

Poster in Workshop: Structured Probabilistic Inference and Generative Modeling

Nonparametric posterior normalizing flows

Sinead A Williamson · Evan Ott

Keywords: [ Normalizing flows ] [ Bayesian ] [ Nonparametric learning ]


Abstract:

Normalizing flows allow us to describe complex probability distributions, and can be used to perform flexible maximum likelihood density estimation (Dinh et al., 2014). Such maximum likelihood density estimation is likely to overfit, particularly if the number of observations is small. Traditional Bayesian approaches offer the prospect of capturing posterior uncertainty, but come at high computational cost and do not provide an intuitive way of incorporating prior information. A nonparametric learning approach (Lyddon et al., 2018) allows us to combine observed data with priors on the space of observations. We present a scalable approximate inference algorithm for nonparametric posterior normalizing flows, and show that the resulting distributions can yield improved generalization and uncertainty quantification.
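The nonparametric learning idea referenced above (Lyddon et al., 2018) can be illustrated with a posterior-bootstrap sketch: each posterior draw is a weighted maximum-likelihood fit to the observed data combined with pseudo-samples from a prior on the space of observations, using random Dirichlet weights. The sketch below is a minimal, illustrative version only: it stands in a one-dimensional affine flow z = (x - mu) / sigma with a standard-normal base (whose weighted MLE has a closed form) for the deep flows in the paper, and the prior centering distribution, the concentration parameter `c`, and the function names are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Small observed sample: plain MLE density estimation on a set this
# small is the overfitting-prone regime the abstract describes.
x_obs = rng.normal(loc=2.0, scale=1.0, size=20)


def prior_pseudo_samples(m, rng):
    """Pseudo-samples from an assumed prior centering distribution on
    observation space (a wide Gaussian here, purely for illustration)."""
    return rng.normal(loc=0.0, scale=3.0, size=m)


def posterior_bootstrap(x_obs, n_draws=200, m_prior=20, c=1.0, rng=rng):
    """Posterior-bootstrap sketch: each draw re-fits the flow by
    weighted MLE on observed data plus prior pseudo-samples, with
    Dirichlet-distributed random weights (concentration `c` controls
    how strongly the prior is weighted relative to the data)."""
    n = len(x_obs)
    draws = []
    for _ in range(n_draws):
        x_prior = prior_pseudo_samples(m_prior, rng)
        x_all = np.concatenate([x_obs, x_prior])
        # Data points get Dirichlet weight alpha ~ 1 each; the prior's
        # total weight c is spread across its pseudo-samples.
        alpha = np.concatenate([np.ones(n), np.full(m_prior, c / m_prior)])
        w = rng.dirichlet(alpha)
        # Weighted MLE of the affine flow z = (x - mu) / sigma is just
        # the weighted mean and weighted standard deviation.
        mu = np.sum(w * x_all)
        sigma = np.sqrt(np.sum(w * (x_all - mu) ** 2))
        draws.append((mu, sigma))
    return np.array(draws)


draws = posterior_bootstrap(x_obs)
print("posterior mean of mu:", draws[:, 0].mean())
print("posterior sd of mu:  ", draws[:, 0].std())
```

With a deep flow, the closed-form weighted MLE step would be replaced by weighted stochastic-gradient maximum likelihood; the spread across draws then quantifies posterior uncertainty over the learned density.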
