Poster
Equivariant Flows: Exact Likelihood Generative Learning for Symmetric Densities
Jonas Köhler · Leon Klein · Frank Noe
Keywords: [ Architectures ] [ Deep Generative Models ] [ Generative Models ] [ Unsupervised Learning ] [ Deep Learning - Generative Models and Autoencoders ]
Normalizing flows are exact-likelihood generative neural networks that approximately transform samples from a simple prior distribution into samples from the probability distribution of interest. Recent work has shown that such generative models can be used in statistical mechanics to sample equilibrium states of many-body systems in physics and chemistry. To scale and generalize these results, it is essential that the natural symmetries of the probability density, which in physics are defined by the invariances of the target potential, are built into the flow.
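The exact-likelihood property comes from the change-of-variables formula for an invertible flow; a minimal sketch, with notation ($f$, $p_Z$, $p_X$) introduced here for illustration rather than taken from the abstract:

$$p_X(x) \;=\; p_Z\!\left(f^{-1}(x)\right)\,\left|\det \frac{\partial f^{-1}(x)}{\partial x}\right|,$$

where $p_Z$ is the simple prior, $f$ is the learned invertible transformation, and $x = f(z)$ is a generated sample.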
We provide a sufficient theoretical criterion showing that the distribution generated by equivariant normalizing flows is invariant with respect to these symmetries by design. Furthermore, we propose building blocks for flows that preserve the symmetries typically found in physical and chemical many-body particle systems. Using benchmark systems motivated by molecular physics, we demonstrate that these symmetry-preserving flows can provide better generalization and sampling efficiency.
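The invariance-by-design claim can be sketched in one line, assuming a prior $p_Z$ that is invariant under a symmetry transformation $g$ with $|\det g| = 1$ (e.g. a rotation or permutation of particles) and a flow $f$ that is equivariant, i.e. $f(g \cdot z) = g \cdot f(z)$:

$$p_X(g \cdot x) \;=\; p_Z\!\left(f^{-1}(g \cdot x)\right)\left|\det J_{f^{-1}}(g \cdot x)\right| \;=\; p_Z\!\left(g \cdot f^{-1}(x)\right)\left|\det J_{f^{-1}}(x)\right| \;=\; p_X(x),$$

where the Jacobian determinants agree by the chain rule applied to $f^{-1}(g \cdot x) = g \cdot f^{-1}(x)$ together with $|\det g| = 1$, so the generated density $p_X$ inherits the symmetry of the prior.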