

Poster

Learning to Scale Logits for Temperature-Conditional GFlowNets

Minsu Kim · Joohwan Ko · Taeyoung Yun · Dinghuai Zhang · Ling Pan · Woo Chang Kim · Jinkyoo Park · Emmanuel Bengio · Yoshua Bengio

Hall C 4-9 #1411
Wed 24 Jul 2:30 a.m. PDT — 4 a.m. PDT

Abstract:

GFlowNets are probabilistic models that sequentially generate compositional structures through a stochastic policy. Among GFlowNets, temperature-conditional GFlowNets introduce temperature-based controllability over exploration and exploitation. We propose Logit-scaling GFlowNets (Logit-GFN), a novel architectural design that greatly accelerates the training of temperature-conditional GFlowNets. It is motivated by the observation that previously proposed approaches introduce numerical challenges in deep network training, since different temperatures can give rise to very different gradient profiles and magnitudes of the policy's logits. We find that these challenges are greatly reduced when a learned function of the temperature is used to scale the policy's logits directly. Logit-GFN also improves GFlowNets' generalization in offline learning and mode discovery in online learning, which we verify empirically on various biological and chemical tasks. Our code is available at https://github.com/dbsxodud-11/logit-gfn
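The core architectural idea in the abstract — scaling the policy's logits by a learned function of the temperature before the softmax — can be illustrated with a minimal sketch. This is not the authors' implementation (see the linked repository for that); the toy weight matrix, the state vector, and the stand-in `scale_fn` are all hypothetical, with a fixed monotone map used where Logit-GFN would learn the scaling function.

```python
import numpy as np

def softmax(x):
    z = x - x.max()          # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

# Hypothetical toy "policy network": a single linear layer.
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 8))      # 4 actions, 8 state features (illustrative)
state = rng.normal(size=8)

def policy_probs(state, beta, scale_fn):
    """Temperature-conditional policy: the same logits are shared
    across temperatures; only the learned scalar scale_fn(beta)
    modulates their magnitude before the softmax."""
    logits = W @ state
    return softmax(scale_fn(beta) * logits)

# Placeholder for the learned scaling function of the temperature.
scale_fn = lambda beta: np.log1p(beta)

p_low = policy_probs(state, beta=0.5, scale_fn=scale_fn)
p_high = policy_probs(state, beta=5.0, scale_fn=scale_fn)
```

Because the temperature enters only through a scalar multiplier on the logits, every temperature sees logits of comparable structure, which is the property the paper credits with stabilizing training; a larger scale sharpens the distribution (more exploitation), a smaller one flattens it (more exploration).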
