

Poster

ButterflyFlow: Building Invertible Layers with Butterfly Matrices

Chenlin Meng · Linqi Zhou · Kristy Choi · Tri Dao · Stefano Ermon

Hall E #322

Keywords: [ DL: Algorithms ] [ DL: Everything Else ] [ DL: Generative Models and Autoencoders ]


Abstract:

Normalizing flows model complex probability distributions using maps obtained by composing invertible layers. Special linear layers such as masked and 1×1 convolutions play a key role in existing architectures because they increase expressive power while having tractable Jacobians and inverses. We propose a new family of invertible linear layers based on butterfly layers, which are known to theoretically capture complex linear structures including permutations and periodicity, yet can be inverted efficiently. This representational power is a key advantage of our approach, as such structures are common in many real-world datasets. Based on our invertible butterfly layers, we construct a new class of normalizing flow models called ButterflyFlow. Empirically, we demonstrate that ButterflyFlows not only achieve strong density estimation results on natural images such as MNIST, CIFAR-10, and ImageNet-32×32, but also obtain significantly better log-likelihoods on structured datasets such as galaxy images and MIMIC-III patient cohorts—all while being more efficient in terms of memory and computation than relevant baselines.
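The abstract is not accompanied by code here, but the core idea—butterfly matrices that are structured products of sparse stages, each acting on coordinate pairs via 2×2 blocks—can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; the function names (butterfly_stage, butterfly_stage_inverse, log_abs_det) and the single-stage setup are illustrative assumptions. It shows why such a layer has a tractable inverse and log-determinant: both reduce to the constituent 2×2 blocks.

```python
# A minimal sketch (not the authors' code) of one butterfly stage, its inverse,
# and its log |det|. A full butterfly matrix of size n = 2^k is a product of k
# such stages with strides 1, 2, 4, ..., n/2; each stage mixes index pairs
# separated by a fixed stride using independent 2x2 blocks.
import numpy as np

def butterfly_stage(x, blocks, stride):
    """Apply one butterfly stage to a batch of vectors.

    x      : (batch, n) input, with n divisible by 2 * stride
    blocks : (n // 2, 2, 2) invertible 2x2 matrices, one per index pair
    stride : distance between paired coordinates in this stage
    """
    batch, n = x.shape
    y = x.copy()
    pair = 0
    for start in range(0, n, 2 * stride):
        for j in range(stride):
            i0, i1 = start + j, start + j + stride
            a, b = x[:, i0], x[:, i1]
            m = blocks[pair]
            y[:, i0] = m[0, 0] * a + m[0, 1] * b
            y[:, i1] = m[1, 0] * a + m[1, 1] * b
            pair += 1
    return y

def butterfly_stage_inverse(y, blocks, stride):
    """Invert a stage by applying the 2x2 block inverses to the same pairs."""
    inv_blocks = np.linalg.inv(blocks)  # batched (n // 2, 2, 2) inverse
    return butterfly_stage(y, inv_blocks, stride)

def log_abs_det(blocks):
    """Log |det| of the stage: product of the 2x2 block determinants."""
    return np.sum(np.log(np.abs(np.linalg.det(blocks))))

# Round-trip check on a toy example (n = 8, stride = 2).
rng = np.random.default_rng(0)
n, stride = 8, 2
x = rng.normal(size=(4, n))
blocks = rng.normal(size=(n // 2, 2, 2)) + 2.0 * np.eye(2)  # keep blocks well-conditioned
y = butterfly_stage(x, blocks, stride)
assert np.allclose(butterfly_stage_inverse(y, blocks, stride), x)
```

Because each stage is sparse, applying the layer, its inverse, and computing its log-determinant all cost O(n) per stage rather than O(n^2) for a dense linear layer, which is the efficiency property the abstract refers to.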
