Workshop
INNF+: Invertible Neural Networks, Normalizing Flows, and Explicit Likelihood Models
Chin-Wei Huang · David Krueger · Rianne Van den Berg · George Papamakarios · Tian Qi Chen · Danilo J. Rezende

Fri Jul 23 02:28 AM -- 11:30 AM (PDT)
Event URL: https://invertibleworkshop.github.io/

Normalizing flows are explicit likelihood models (ELMs) characterized by a flexible, invertible reparameterization of high-dimensional probability distributions. Unlike other ELMs, they offer both exact and efficient likelihood computation and data generation. Since their introduction, flow-based models have seen a significant surge of interest in the machine learning community. As a result, powerful flow-based models have been developed, with successes in density estimation, variational inference, and generative modeling of images, audio, and video.
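
To make the "exact and efficient likelihood" property concrete, here is a minimal sketch (not part of the workshop materials): a one-dimensional flow built from a single affine bijection, where the parameters a, b and the standard-normal base distribution are illustrative assumptions. The exact log-density follows from the change-of-variables formula, and generation only requires pushing base samples through the map; practical flows stack many such bijections.

```python
import numpy as np

# Hypothetical toy flow: a single invertible affine map x = a*z + b with a
# standard-normal base distribution. Real flows compose many such bijections
# (coupling layers, autoregressive transforms, etc.).
a, b = 2.0, -1.0  # assumed parameters; a must be nonzero for invertibility

def forward(z):
    # Generation: push base samples through the flow, x = f(z).
    return a * z + b

def inverse(x):
    # Exact inversion: z = f^{-1}(x) = (x - b) / a.
    return (x - b) / a

def log_prob(x):
    # Change of variables: log p(x) = log N(f^{-1}(x); 0, 1) - log|df/dz|.
    z = inverse(x)
    log_base = -0.5 * (z ** 2 + np.log(2.0 * np.pi))
    log_det_jacobian = np.log(np.abs(a))  # |df/dz| = |a| for an affine map
    return log_base - log_det_jacobian

rng = np.random.default_rng(0)
z = rng.standard_normal(5)   # sample from the base distribution
x = forward(z)               # efficient data generation
print(x)
print(log_prob(x))           # exact log-likelihood of the generated samples
```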

As the field moves forward, the main goal of the workshop is to consolidate recent progress and connect ideas from related fields. Over the past few years, we have seen that normalizing flows are deeply connected to latent variable models, autoregressive models, and, more recently, diffusion-based generative models. This year, we would like to further push the frontier of these explicit likelihood models through the lens of invertible reparameterization. We encourage researchers to use these models in conjunction to exploit their benefits at once, and to work together to resolve some common issues of likelihood-based methods, such as miscalibration of out-of-distribution uncertainty.

Fri 2:28 a.m. - 2:30 a.m.
Opening (Talk)   
Fri 2:30 a.m. - 2:55 a.m.
Invited Talk 1 (Charline Le Lan): On the use of density models for anomaly detection (Talk)   
Charline Le Lan
Fri 2:55 a.m. - 3:00 a.m.
Q&A (Charline Le Lan) (Q&A)
Fri 3:00 a.m. - 3:25 a.m.
Invited Talk 2 (Yingzhen Li): Inference with scores: slices, diffusions and flows (Talk)   
Yingzhen Li
Fri 3:25 a.m. - 3:30 a.m.
Q&A (Yingzhen Li) (Q&A)
Fri 3:30 a.m. - 3:35 a.m.
Spotlight 1: Distilling the Knowledge from Normalizing Flows (Talk)   
Invertible Workshop INNF
Fri 3:35 a.m. - 3:40 a.m.
Spotlight 2: Why be adversarial? Let's cooperate!: Cooperative Dataset Alignment via JSD Upper Bound (Talk)   
Invertible Workshop INNF
Fri 3:40 a.m. - 3:45 a.m.
Spotlight 3: Representational aspects of depth and conditioning in normalizing flows (Talk)   
Invertible Workshop INNF
Fri 3:45 a.m. - 3:50 a.m.
Spotlight 4: Rectangular Flows for Manifold Learning (Talk)   
Invertible Workshop INNF
Fri 3:50 a.m. - 3:55 a.m.
Spotlight 5: Interpreting diffusion score matching using normalizing flow (Talk)   
Invertible Workshop INNF
Fri 3:55 a.m. - 4:00 a.m.
Spotlight 6: Universal Approximation using Well-conditioned Normalizing Flows (Talk)   
Invertible Workshop INNF
Fri 4:00 a.m. - 5:00 a.m.

Poster room 1:

https://eventhosts.gather.town/0fH1RU147QI1cqPq/innf-poster-room-1

Poster room 2:

https://eventhosts.gather.town/Q6X3qZkT5TMp3HPu/innf-poster-room-2

When and Where:

https://docs.google.com/spreadsheets/u/1/d/1l1hA6IyEDLkzNMQuO2BLtsLWAI05dEC0R5lxNPJriMY/edit#gid=0

Fri 4:59 a.m. - 5:00 a.m.
Intro (Talk)
Fri 5:00 a.m. - 5:25 a.m.
Invited Talk 3 (Phiala Shanahan): Flow models for theoretical particle and nuclear physics (Talk)   
Phiala Shanahan
Fri 5:25 a.m. - 5:30 a.m.
Q&A (Phiala Shanahan) (Q&A)
Fri 5:30 a.m. - 5:55 a.m.
Invited Talk 4 (Marcus Brubaker): Wavelet Flow: Fast Training of High Resolution Normalizing Flows (Talk)   
Marcus A Brubaker
Fri 5:55 a.m. - 6:00 a.m.
Q&A (Marcus Brubaker) (Q&A)
Fri 6:00 a.m. - 7:30 a.m.
Break
Fri 7:29 a.m. - 7:30 a.m.
Intro (talk)
Fri 7:30 a.m. - 7:55 a.m.
Invited Talk 5 (Stefano Ermon): Maximum Likelihood Training of Score-Based Diffusion Models (Talk)   
Stefano Ermon
Fri 7:55 a.m. - 8:00 a.m.
Q&A (Stefano Ermon) (Q&A)
Fri 8:00 a.m. - 8:25 a.m.
Contributed Talk 1: Diffeomorphic Explanations with Normalizing Flows (Talk)   
Invertible Workshop INNF
Fri 8:25 a.m. - 8:30 a.m.
Q&A (Ann-Kathrin Dombrowski) (Q&A)
Fri 8:30 a.m. - 8:55 a.m.
Invited Talk 6 (Maximilian Nickel): Modeling Spatio-Temporal Events via Normalizing Flows (Talk)   
Max Nickel
Fri 8:55 a.m. - 9:00 a.m.
Q&A (Maximilian Nickel) (Q&A)
Fri 9:00 a.m. - 9:25 a.m.
Invited Talk 7 (Aditya Ramesh): Scaling up generative models (Talk)   
Aditya Ramesh
Fri 9:25 a.m. - 9:30 a.m.
Q&A (Aditya Ramesh) (Q&A)
Fri 9:30 a.m. - 9:55 a.m.
Contributed Talk 2: Efficient Bayesian Sampling Using Normalizing Flows to Assist Markov Chain Monte Carlo Methods (Talk)   
Invertible Workshop INNF
Fri 9:55 a.m. - 10:00 a.m.
Q&A (Marylou Gabrié) (Q&A)
Fri 10:00 a.m. - 10:05 a.m.
Spotlight 7: Sliced Iterative Normalizing Flows (Talk)   
Invertible Workshop INNF
Fri 10:05 a.m. - 10:10 a.m.
Spotlight 8: Universal Approximation of Residual Flows in Maximum Mean Discrepancy (Talk)   
Invertible Workshop INNF
Fri 10:10 a.m. - 10:15 a.m.
Spotlight 9: On Fast Sampling of Diffusion Probabilistic Models (Talk)   
Invertible Workshop INNF
Fri 10:15 a.m. - 10:20 a.m.
Spotlight 10: Discrete Denoising Flows (Talk)   
Invertible Workshop INNF
Fri 10:20 a.m. - 10:25 a.m.
Spotlight 11: Task-agnostic Continual Learning with Hybrid Probabilistic Models (Talk)   
Invertible Workshop INNF
Fri 10:25 a.m. - 10:30 a.m.
Spotlight 12: Conformal Embedding Flows: Tractable Density Estimation on Learned Manifolds (Talk)   
Invertible Workshop INNF
Fri 10:30 a.m. - 11:30 a.m.

Poster room 1:

https://eventhosts.gather.town/0fH1RU147QI1cqPq/innf-poster-room-1

Poster room 2:

https://eventhosts.gather.town/Q6X3qZkT5TMp3HPu/innf-poster-room-2

When and Where:

https://docs.google.com/spreadsheets/u/1/d/1l1hA6IyEDLkzNMQuO2BLtsLWAI05dEC0R5lxNPJriMY/edit#gid=0

Author Information

Chin-Wei Huang (MILA)
David Krueger (Université de Montréal)
Rianne Van den Berg (University of Amsterdam)
George Papamakarios (DeepMind)
Ricky T. Q. Chen (U of Toronto)
Danilo J. Rezende (DeepMind)

Danilo is a Senior Staff Research Scientist at Google DeepMind, where he works on probabilistic machine reasoning and learning algorithms. He has a BA in Physics and MSc in Theoretical Physics from Ecole Polytechnique (Palaiseau – France) and from the Institute of Theoretical Physics (SP – Brazil) and a Ph.D. in Computational Neuroscience at Ecole Polytechnique Federale de Lausanne, EPFL (Lausanne – Switzerland). His research focuses on scalable inference methods, generative models of complex data (such as images and video), applied probability, causal reasoning and unsupervised learning for decision-making.
