Poster
A theory of continuous generative flow networks
Salem Lahlou · Tristan Deleu · Pablo Lemos · Dinghuai Zhang · Alexandra Volokhova · Alex Hernandez-Garcia · Lena Nehale Ezzine · Yoshua Bengio · Nikolay Malkin

Tue Jul 25 02:00 PM -- 03:30 PM (PDT) @ Exhibit Hall 1 #418

Generative flow networks (GFlowNets) are amortized variational inference algorithms that are trained to sample from unnormalized target distributions over compositional objects. To date, a key limitation of GFlowNets has been that they are restricted to discrete spaces. We present a theory for generalized GFlowNets, which encompasses both existing discrete GFlowNets and ones with continuous or hybrid state spaces, and perform experiments with two goals in mind. First, we illustrate critical points of the theory and the importance of various assumptions. Second, we empirically demonstrate how observations about discrete GFlowNets transfer to the continuous case and show strong results compared to non-GFlowNet baselines on several previously studied tasks. This work greatly widens the perspectives for the application of GFlowNets in probabilistic inference and various modeling settings.
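To make the training setup concrete, the sketch below shows a trajectory-balance-style objective for a continuous state space, with Gaussian forward and backward policies over a fixed-horizon trajectory in R^d. This is a minimal, hypothetical PyTorch example: the network architecture, fixed horizon, toy reward, and all names are illustrative assumptions, not the paper's experimental setup.

```python
import torch
import torch.nn as nn

class GaussianPolicy(nn.Module):
    """Maps a state to the mean and log-std of a Gaussian step distribution."""
    def __init__(self, dim=2, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.ELU(),
            nn.Linear(hidden, 2 * dim),
        )

    def forward(self, s):
        mean, log_std = self.net(s).chunk(2, dim=-1)
        return torch.distributions.Normal(mean, log_std.exp())

def trajectory_balance_loss(pf, pb, log_z, log_reward, batch=16, horizon=3, dim=2):
    """Sample trajectories from the forward policy and return the mean of
    (log Z + sum log P_F - log R - sum log P_B)^2 over the batch."""
    s = torch.zeros(batch, dim)                      # fixed initial state s_0
    log_pf = torch.zeros(batch)
    log_pb = torch.zeros(batch)
    for _ in range(horizon):
        dist_f = pf(s)
        step = dist_f.sample()                       # continuous action/increment
        s_next = s + step
        log_pf = log_pf + dist_f.log_prob(step).sum(-1)
        dist_b = pb(s_next)                          # backward policy density over the reverse step
        log_pb = log_pb + dist_b.log_prob(s_next - s).sum(-1)
        s = s_next
    log_r = log_reward(s)                            # log of the unnormalized target density
    return ((log_z + log_pf - log_r - log_pb) ** 2).mean()

# Toy usage: the unnormalized target is a standard Gaussian density.
pf, pb = GaussianPolicy(), GaussianPolicy()
log_z = nn.Parameter(torch.zeros(1))
opt = torch.optim.Adam([*pf.parameters(), *pb.parameters(), log_z], lr=1e-3)
opt.zero_grad()
loss = trajectory_balance_loss(pf, pb, log_z, lambda x: -0.5 * (x ** 2).sum(-1))
loss.backward()
opt.step()
```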

Author Information

Salem Lahlou (Mila - Université de Montréal)
Tristan Deleu (Mila - Université de Montréal)
Pablo Lemos (Mila)
Dinghuai Zhang (Mila)
Alexandra Volokhova (Mila - Quebec AI Institute)
Alex Hernandez-Garcia (Mila - Quebec AI Institute)
Lena Nehale Ezzine (Mila - Quebec AI Institute)
Yoshua Bengio (Mila - Quebec AI Institute)
Nikolay Malkin (Mila - Université de Montréal)