Poster
Relaxing Bijectivity Constraints with Continuously Indexed Normalising Flows
Rob Cornish · Anthony Caterini · George Deligiannidis · Arnaud Doucet
Virtual
Keywords: [ Architectures ] [ Deep Generative Models ] [ Generative Models ] [ Unsupervised Learning ] [ Deep Learning - Generative Models and Autoencoders ]
We show that normalising flows become pathological when used to model targets whose supports have complicated topologies. In this scenario, we prove that a flow must become arbitrarily numerically noninvertible in order to approximate the target closely. This result has implications for all flow-based models, and especially residual flows (ResFlows), which explicitly control the Lipschitz constant of the bijection used. To address this, we propose continuously indexed flows (CIFs), which replace the single bijection used by normalising flows with a continuously indexed family of bijections, and which can intuitively "clean up" mass that would otherwise be misplaced by a single bijection. We show theoretically that CIFs are not subject to the same topological limitations as normalising flows, and obtain better empirical performance on a variety of models and benchmarks.
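Below is a minimal sketch of a single CIF step, for intuition only. It assumes a simple elementwise affine bijection F(z; u) = exp(s(u)) ⊙ z + t(u) indexed by u, a Gaussian conditional prior p(u | z), and a Gaussian approximate posterior q(u | x) for the variational lower bound; the paper's experiments use more expressive bijections (e.g. ResFlows), and all module and network names here are illustrative rather than taken from the authors' code.

```python
import torch
import torch.nn as nn
from torch.distributions import Normal

class CIFStep(nn.Module):
    """One continuously indexed flow step (illustrative sketch)."""

    def __init__(self, dim, u_dim, hidden=64):
        super().__init__()
        # Index networks: s and t make the bijection F(.; u) depend on the index u.
        self.s = nn.Sequential(nn.Linear(u_dim, hidden), nn.Tanh(), nn.Linear(hidden, dim))
        self.t = nn.Sequential(nn.Linear(u_dim, hidden), nn.Tanh(), nn.Linear(hidden, dim))
        # p(u | z): conditional prior over the index in the generative direction.
        self.p_net = nn.Sequential(nn.Linear(dim, hidden), nn.Tanh(), nn.Linear(hidden, 2 * u_dim))
        # q(u | x): approximate posterior used for the variational bound.
        self.q_net = nn.Sequential(nn.Linear(dim, hidden), nn.Tanh(), nn.Linear(hidden, 2 * u_dim))

    @staticmethod
    def _gaussian(params):
        mu, log_sigma = params.chunk(2, dim=-1)
        return Normal(mu, log_sigma.exp())

    def elbo_terms(self, x):
        """Single-sample lower bound: log p_X(x) >= log p_Z(z) + surrogate."""
        q_u = self._gaussian(self.q_net(x))
        u = q_u.rsample()                     # u ~ q(u | x), reparameterised
        s, t = self.s(u), self.t(u)
        z = (x - t) * torch.exp(-s)           # z = F^{-1}(x; u)
        log_det = -s.sum(-1)                  # log |det dF^{-1}(x; u)/dx|
        p_u = self._gaussian(self.p_net(z))   # conditional prior p(u | z)
        surrogate = log_det + p_u.log_prob(u).sum(-1) - q_u.log_prob(u).sum(-1)
        return z, surrogate

    def sample(self, z):
        """Generative direction: a fresh index u is drawn per sample, which lets
        mass that a single bijection would misplace be 'cleaned up'."""
        u = self._gaussian(self.p_net(z)).sample()
        return torch.exp(self.s(u)) * z + self.t(u)


# Usage: maximise the lower bound with a standard-normal base distribution.
x = torch.randn(128, 2)
step = CIFStep(dim=2, u_dim=2)
z, surrogate = step.elbo_terms(x)
log_pz = Normal(0.0, 1.0).log_prob(z).sum(-1)
loss = -(log_pz + surrogate).mean()
```

Because the index u varies continuously, the composed map x ↦ z need not be a single homeomorphism, which is what lets a CIF sidestep the topological obstruction that forces an ordinary flow to become arbitrarily numerically noninvertible.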