

Poster

Relaxing Bijectivity Constraints with Continuously Indexed Normalising Flows

Rob Cornish · Anthony Caterini · George Deligiannidis · Arnaud Doucet


Keywords: [ Deep Learning - Generative Models and Autoencoders ] [ Unsupervised Learning ] [ Generative Models ] [ Deep Generative Models ] [ Architectures ]


Abstract:

We show that normalising flows become pathological when used to model targets whose supports have complicated topologies. In this scenario, we prove that a flow must become arbitrarily numerically noninvertible in order to approximate the target closely. This result has implications for all flow-based models, and especially residual flows (ResFlows), which explicitly control the Lipschitz constant of the bijection used. To address this, we propose continuously indexed flows (CIFs), which replace the single bijection used by normalising flows with a continuously indexed family of bijections, and which can intuitively "clean up" mass that would otherwise be misplaced by a single bijection. We show theoretically that CIFs are not subject to the same topological limitations as normalising flows, and obtain better empirical performance on a variety of models and benchmarks.
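To make the mechanism concrete, below is a minimal PyTorch sketch of a single continuously indexed layer, assuming an affine bijection F(z; u) = exp(s(u)) * z + t(u) indexed by a continuous u. The networks s_net and t_net, the Gaussian index prior p(u | z), the variational posterior q(u | x), and all dimensions are illustrative assumptions, not the paper's exact architecture.

```python
# A rough sketch of one CIF layer (illustrative, not the authors' implementation).
# Sampling: z -> draw u ~ p(u | z) -> x = F(z; u).
# Density estimation: since u is marginalised out, log p(x) is intractable;
# a variational posterior q(u | x) gives a single-sample ELBO instead.
import math
import torch
import torch.nn as nn


def gaussian_log_prob(x, mu, log_sigma):
    """Diagonal-Gaussian log density, summed over the last dimension."""
    return (-0.5 * ((x - mu) / log_sigma.exp()) ** 2
            - log_sigma - 0.5 * math.log(2 * math.pi)).sum(-1)


class AffineCIFLayer(nn.Module):
    def __init__(self, dim, u_dim, hidden=64):
        super().__init__()
        # Bijection parameters as functions of the index u (assumed small MLPs).
        self.s_net = nn.Sequential(nn.Linear(u_dim, hidden), nn.Tanh(), nn.Linear(hidden, dim))
        self.t_net = nn.Sequential(nn.Linear(u_dim, hidden), nn.Tanh(), nn.Linear(hidden, dim))
        # p(u | z): Gaussian index prior whose parameters depend on z.
        self.p_u = nn.Sequential(nn.Linear(dim, hidden), nn.Tanh(), nn.Linear(hidden, 2 * u_dim))
        # q(u | x): variational posterior over the index given data.
        self.q_u = nn.Sequential(nn.Linear(dim, hidden), nn.Tanh(), nn.Linear(hidden, 2 * u_dim))

    def forward(self, z):
        """Sampling direction: z -> x with a freshly sampled index u."""
        mu, log_sigma = self.p_u(z).chunk(2, dim=-1)
        u = mu + log_sigma.exp() * torch.randn_like(mu)
        return self.s_net(u).exp() * z + self.t_net(u)

    def elbo_terms(self, x):
        """Density direction: returns z and this layer's ELBO contribution."""
        mu_q, log_sig_q = self.q_u(x).chunk(2, dim=-1)
        u = mu_q + log_sig_q.exp() * torch.randn_like(mu_q)  # u ~ q(u | x)
        s, t = self.s_net(u), self.t_net(u)
        z = (x - t) * (-s).exp()          # invert F(.; u) for this particular u
        log_det = -s.sum(-1)              # log |det dF^{-1}/dx| (diagonal affine)
        mu_p, log_sig_p = self.p_u(z).chunk(2, dim=-1)
        log_p_u = gaussian_log_prob(u, mu_p, log_sig_p)
        log_q_u = gaussian_log_prob(u, mu_q, log_sig_q)
        # Full single-layer bound: log p(x) >= log p_Z(z) + (terms below).
        return z, log_det + log_p_u - log_q_u
```

In this sketch, stacking several such layers and summing their per-layer terms along with the base log density log p_Z(z) would yield the full lower bound trained in place of an exact log-likelihood. The key point mirrors the abstract: because u varies continuously, different regions of mass can be routed through different bijections, so no single bijection needs to tear apart a topologically complicated support.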
