Poster in Workshop: Topology, Algebra, and Geometry in Machine Learning

Neural Geometric Embedding Flows

Aaron Lou · Yang Song · Jiaming Song · Stefano Ermon


Abstract:

Embedding flows are normalizing flows that relax the bijectivity requirement while retaining many benefits such as left invertibility and exact log-likelihoods. As such, they can learn distributions residing on lower-dimensional submanifolds. However, previous research models data with simple normalizing flows like RealNVP applied to a Euclidean subspace, ignoring much of the core geometric machinery and resulting in subpar modeling and pathological behavior. In this paper, we address these issues by connecting embedding flows to the field of extrinsic geometric flows. Using these insights, we introduce two geometrically motivated embedding flows. First, to partially overcome topological mismatch, we show how to apply these models to arbitrary submanifolds. Second, we construct a continuous-time embedding flow and show that the increased expressivity produces more accurate results.
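The abstract's claim of exact log-likelihoods with only left invertibility follows from the injective change-of-variables formula, log p_X(x) = log p_Z(z) - (1/2) log det(J^T J), where J is the Jacobian of the embedding map. The sketch below is a minimal toy illustration of this idea using a single affine injective map from R^d into R^D; the matrix `W`, offset `b`, and helper names are illustrative assumptions and not the architecture proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

d, D = 2, 5                      # intrinsic (latent) and ambient dimensions
W = rng.standard_normal((D, d))  # assumed full-column-rank embedding matrix
b = rng.standard_normal(D)

def embed(z):
    """Injective affine 'embedding flow' step: R^d -> R^D."""
    return W @ z + b

def left_inverse(x):
    """Exact left inverse on the image submanifold: z = (W^T W)^{-1} W^T (x - b)."""
    return np.linalg.solve(W.T @ W, W.T @ (x - b))

def log_likelihood(x):
    """Injective change of variables:
    log p_X(x) = log p_Z(z) - 0.5 * log det(J^T J), with J = W for this affine map."""
    z = left_inverse(x)
    log_pz = -0.5 * (z @ z) - 0.5 * d * np.log(2 * np.pi)  # standard-normal base density
    _, logdet = np.linalg.slogdet(W.T @ W)
    return log_pz - 0.5 * logdet

z0 = rng.standard_normal(d)
x0 = embed(z0)
print(np.allclose(left_inverse(x0), z0))  # True: the map is left invertible
print(log_likelihood(x0))                 # exact log-density on the submanifold
```

In practice the embedding map is a learned, nonlinear injective network rather than this fixed affine map, but the density computation follows the same left-inverse plus Jacobian-determinant pattern.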
