In this work, we propose a novel prior learning method for advancing generalization and uncertainty estimation in deep neural networks. The key idea is to exploit scalable and structured posteriors of neural networks as informative priors with generalization guarantees. Our learned priors provide expressive probabilistic representations at large scale, e.g., Bayesian counterparts of models pre-trained on ImageNet, and further produce non-vacuous generalization bounds. We also extend this idea to a continual learning framework, where the favorable properties of our priors are desirable. Major enablers are our technical contributions: (1) sums-of-Kronecker-product computations, and (2) derivations and optimizations of tractable objectives that lead to improved generalization bounds. Empirically, we extensively demonstrate the effectiveness of this method for uncertainty estimation and generalization.
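The sums-of-Kronecker-product structure mentioned in (1) admits matrix-vector products without ever materializing the Kronecker factors, via the standard identity (A ⊗ B) vec(X) = vec(B X Aᵀ). The following is a minimal NumPy sketch of this idea, not the authors' implementation; the function name `sum_kron_matvec` and the factor lists are illustrative assumptions.

```python
import numpy as np

def sum_kron_matvec(As, Bs, v):
    """Illustrative sketch: compute (sum_i A_i (x) B_i) @ v without
    forming any Kronecker product, using the vec-trick
    (A (x) B) vec(X) = vec(B X A^T) with column-major vec."""
    m = Bs[0].shape[1]          # column count of each B_i
    n = As[0].shape[1]          # column count of each A_i
    # Undo the column-major vec: v = vec(X) with X of shape (m, n).
    X = v.reshape(n, m).T
    # Accumulate B_i X A_i^T over all Kronecker summands.
    out = sum(B @ X @ A.T for A, B in zip(As, Bs))
    # Re-apply the column-major vec to return a flat vector.
    return out.T.reshape(-1)

# Sanity check against the dense sum of Kronecker products.
rng = np.random.default_rng(0)
As = [rng.standard_normal((3, 3)) for _ in range(2)]
Bs = [rng.standard_normal((4, 4)) for _ in range(2)]
v = rng.standard_normal(12)
dense = sum(np.kron(A, B) for A, B in zip(As, Bs)) @ v
fast = sum_kron_matvec(As, Bs, v)
assert np.allclose(dense, fast)
```

For d-dimensional factors, the dense product costs O(d⁴) per matvec while the factored form costs O(d³), which is what makes Kronecker-structured posteriors scalable to large networks.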
Author Information
Dominik Schnaus (Technical University of Munich)
Jongseok Lee (German Aerospace Center (DLR))
Daniel Cremers (Technical University of Munich)
Rudolph Triebel (German Aerospace Center (DLR))
More from the Same Authors
- 2023 Poster: Beyond In-Domain Scenarios: Robust Density-Aware Calibration
  Christian Tomani · Futa Waseda · Yuesong Shen · Daniel Cremers
- 2021 Poster: Variational Data Assimilation with a Learned Inverse Observation Operator
  Thomas Frerix · Dmitrii Kochkov · Jamie Smith · Daniel Cremers · Michael Brenner · Stephan Hoyer
- 2021 Spotlight: Variational Data Assimilation with a Learned Inverse Observation Operator
  Thomas Frerix · Dmitrii Kochkov · Jamie Smith · Daniel Cremers · Michael Brenner · Stephan Hoyer
- 2020 Paper spotlight: Learning Multiplicative Interactions with Bayesian Neural Networks for Visual-Inertial Odometry
  Kashmira Shinde · Jongseok Lee
- 2020 Panel Discussion 1
  Daniel Cremers · Nemanja Djuric · Ingmar Posner · Dariu Gavrila
- 2020 Q&A: Daniel Cremers
  Daniel Cremers
- 2020 Invited Talk: Deep Direct Visual SLAM (Daniel Cremers)
  Daniel Cremers
- 2020 Poster: Estimating Model Uncertainty of Neural Networks in Sparse Information Form
  Jongseok Lee · Matthias Humt · Jianxiang Feng · Rudolph Triebel
- 2019 Poster: Flat Metric Minimization with Applications in Generative Modeling
  Thomas Möllenhoff · Daniel Cremers
- 2019 Oral: Flat Metric Minimization with Applications in Generative Modeling
  Thomas Möllenhoff · Daniel Cremers