Neural networks are increasingly being used to solve partial differential equations (PDEs), replacing slower numerical solvers. However, a critical issue is that neural PDE solvers require high-quality ground truth data, which usually must come from the very solvers they are designed to replace. Thus, we are presented with a proverbial chicken-and-egg problem. In this paper, we present a method that can partially alleviate this problem by improving neural PDE solver sample complexity: Lie point symmetry data augmentation (LPSDA). In the context of PDEs, it turns out that we are able to quantitatively derive an exhaustive list of data transformations, based on the Lie point symmetry group of the PDEs in question, something not possible in other application areas. We present this framework and demonstrate how it can easily be deployed to improve neural PDE solver sample complexity by an order of magnitude.
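To make the idea concrete, here is a minimal sketch (not the paper's implementation) of symmetry-based augmentation for the Korteweg-de Vries (KdV) equation, which admits spatial translation, u(x, t) -> u(x - eps, t), and the Galilean boost, u(x, t) -> u(x - c*t, t) + c, as Lie point symmetries. The helper names, the uniform periodic grid, the placeholder trajectory, and the sampled parameter ranges below are illustrative assumptions; trajectories are assumed to be stored as arrays of shape (nt, nx).

```python
import numpy as np

def space_translate(u, x, eps):
    """Spatial translation symmetry: u(x, t) -> u(x - eps, t).

    Assumes u has shape (nt, nx) on a uniform periodic grid x;
    eps is rounded to the nearest whole grid shift.
    """
    dx = x[1] - x[0]
    return np.roll(u, int(round(eps / dx)), axis=1)

def galilean_boost(u, x, t, c):
    """Galilean boost for KdV-type equations: u(x, t) -> u(x - c*t, t) + c.

    Evaluated on the original grid via periodic linear interpolation,
    one time slice at a time.
    """
    L = (x[1] - x[0]) * len(x)  # length of the periodic domain
    out = np.empty_like(u)
    for i in range(len(t)):
        out[i] = np.interp(x - c * t[i], x, u[i], period=L) + c
    return out

# Illustrative usage on a placeholder trajectory (not real solver output):
nx, nt = 256, 100
x = np.linspace(0.0, 2 * np.pi, nx, endpoint=False)
t = np.linspace(0.0, 1.0, nt)
u = np.sin(x)[None, :] * np.exp(-t)[:, None]

rng = np.random.default_rng(0)
u_aug = galilean_boost(
    space_translate(u, x, eps=rng.uniform(0.0, 2 * np.pi)),
    x, t, c=rng.uniform(-0.5, 0.5),
)
assert u_aug.shape == u.shape
```

Because such transformations map true solutions of the PDE to other true solutions, applying them to genuine solver output enlarges the training set at essentially no extra simulation cost.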
Author Information
Johannes Brandstetter (Microsoft Research)
Max Welling (University of Amsterdam)
Prof. Dr. Max Welling is a research chair in Machine Learning at the University of Amsterdam and a VP Technologies at Qualcomm. He has a secondary appointment as a senior fellow at the Canadian Institute for Advanced Research (CIFAR). He is a co-founder of “Scyfer BV”, a university spin-off in deep learning which was acquired by Qualcomm in the summer of 2017. In the past he held postdoctoral positions at Caltech (’98-’00), UCL (’00-’01) and the University of Toronto (’01-’03). He received his PhD in ’98 under the supervision of Nobel laureate Prof. G. 't Hooft. Max Welling served as associate editor-in-chief of IEEE TPAMI from 2011-2015 (impact factor 4.8). He has served on the board of the NIPS foundation (the largest conference in machine learning) since 2015 and was program chair and general chair of NIPS in 2013 and 2014, respectively. He was also program chair of AISTATS in 2009 and ECCV in 2016, and general chair of MIDL 2018. He has served on the editorial boards of JMLR and JML and was an associate editor for Neurocomputing, JCGS and TPAMI. He has received multiple grants from Google, Facebook, Yahoo, NSF, NIH, NWO and ONR-MURI, among them an NSF CAREER grant in 2005. He is a recipient of the ECCV Koenderink Prize (2010). Welling is on the board of the Data Science Research Center in Amsterdam, directs the Amsterdam Machine Learning Lab (AMLAB), and co-directs the Qualcomm-UvA deep learning lab (QUVA) and the Bosch-UvA Deep Learning lab (DELTA). Max Welling has over 200 scientific publications in machine learning, computer vision, statistics and physics.
Daniel Worrall (DeepMind)
Related Events (a corresponding poster, oral, or spotlight)
- 2022 Poster: Lie Point Symmetry Data Augmentation for Neural PDE Solvers
  Thu. Jul 21st through Fri. Jul 22nd, Room Hall E #436
More from the Same Authors
- 2023 Workshop: Structured Probabilistic Inference and Generative Modeling
  Dinghuai Zhang · Yuanqi Du · Chenlin Meng · Shawn Tan · Yingzhen Li · Max Welling · Yoshua Bengio
- 2022 Poster: Align-RUDDER: Learning From Few Demonstrations by Reward Redistribution
  Vihang Patil · Markus Hofmarcher · Marius-Constantin Dinu · Matthias Dorfer · Patrick Blies · Johannes Brandstetter · Jose A. Arjona-Medina · Sepp Hochreiter
- 2022 Oral: Align-RUDDER: Learning From Few Demonstrations by Reward Redistribution
  Vihang Patil · Markus Hofmarcher · Marius-Constantin Dinu · Matthias Dorfer · Patrick Blies · Johannes Brandstetter · Jose A. Arjona-Medina · Sepp Hochreiter
- 2021 Test of Time: Test of Time Award
  Max Welling
- 2021 Poster: A Practical Method for Constructing Equivariant Multilayer Perceptrons for Arbitrary Matrix Groups
  Marc Finzi · Max Welling · Andrew Wilson
- 2021 Oral: A Practical Method for Constructing Equivariant Multilayer Perceptrons for Arbitrary Matrix Groups
  Marc Finzi · Max Welling · Andrew Wilson
- 2021: Talk 2
  Johannes Brandstetter
- 2021 Expo Talk Panel: Unique Research Opportunities in AI Algorithms, Health, Traffic, and Weather
  Johannes Brandstetter · Sepp Hochreiter · Michael Kopp · David P Kreil · Alina Mihai
- 2019 Workshop: Learning and Reasoning with Graph-Structured Representations
  Ethan Fetaya · Zhiting Hu · Thomas Kipf · Yujia Li · Xiaodan Liang · Renjie Liao · Raquel Urtasun · Hao Wang · Max Welling · Eric Xing · Richard Zemel
- 2019 Poster: Stochastic Beams and Where To Find Them: The Gumbel-Top-k Trick for Sampling Sequences Without Replacement
  Wouter Kool · Herke van Hoof · Max Welling
- 2019 Oral: Stochastic Beams and Where To Find Them: The Gumbel-Top-k Trick for Sampling Sequences Without Replacement
  Wouter Kool · Herke van Hoof · Max Welling
- 2018 Invited Talk: Intelligence per Kilowatthour
  Max Welling
- 2018 Poster: BOCK : Bayesian Optimization with Cylindrical Kernels
  ChangYong Oh · Efstratios Gavves · Max Welling
- 2018 Oral: BOCK : Bayesian Optimization with Cylindrical Kernels
  ChangYong Oh · Efstratios Gavves · Max Welling
- 2017 Poster: Multiplicative Normalizing Flows for Variational Bayesian Neural Networks
  Christos Louizos · Max Welling
- 2017 Talk: Multiplicative Normalizing Flows for Variational Bayesian Neural Networks
  Christos Louizos · Max Welling