Stochastic differential equations (SDEs) are an important modeling class in many disciplines. Consequently, many estimation methods exist, relying on various discretization and numerical integration schemes. In this paper, we propose a novel probabilistic model for estimating the drift and diffusion of a stochastic system from noisy observations. Using state-of-the-art adversarial and moment-matching inference techniques, we avoid the discretization schemes of classical approaches, which leads to significant improvements in parameter accuracy and in robustness to random initial guesses. On four commonly used benchmark systems, we demonstrate the performance of our algorithms against state-of-the-art solutions based on extended Kalman filtering and Gaussian processes.
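To give a flavor of the moment-matching idea, here is a toy, numpy-only sketch (not the paper's algorithm): we estimate the drift rate of an Ornstein-Uhlenbeck process by minimizing a squared maximum mean discrepancy (MMD) between observed and simulated samples over a small candidate grid. The OU model, function names, bandwidth, and grid-search estimator are all illustrative assumptions, and simulation here still uses Euler-Maruyama, which the paper's approach avoids.

```python
import numpy as np

def simulate_ou(theta, sigma, x0, dt, n_steps, n_paths, rng):
    """Euler-Maruyama simulation of dX = -theta * X dt + sigma dW; returns terminal values."""
    x = np.full(n_paths, x0, dtype=float)
    for _ in range(n_steps):
        x = x - theta * x * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)
    return x

def mmd2_rbf(x, y, bandwidth=1.0):
    """Biased estimate of squared MMD between 1-D samples, with an RBF kernel."""
    def k(a, b):
        d = a[:, None] - b[None, :]
        return np.exp(-d**2 / (2 * bandwidth**2))
    return k(x, x).mean() + k(y, y).mean() - 2 * k(x, y).mean()

rng = np.random.default_rng(0)
true_theta, sigma = 1.5, 0.3

# "Observed" data: terminal values of 500 paths from the true process.
observed = simulate_ou(true_theta, sigma, x0=1.0, dt=0.01,
                       n_steps=100, n_paths=500, rng=rng)

# Moment matching by grid search: pick the drift rate whose simulated
# samples are closest to the observations in MMD.
candidates = np.linspace(0.5, 3.0, 11)
scores = [mmd2_rbf(observed,
                   simulate_ou(t, sigma, 1.0, 0.01, 100, 500, rng),
                   bandwidth=0.2)
          for t in candidates]
best = float(candidates[int(np.argmin(scores))])
```

In the paper's actual setting, the grid search would be replaced by gradient-based optimization of drift and diffusion parameters, and the comparison would use the inference machinery described in the abstract rather than discretized forward simulation.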
Author Information
Gabriele Abbati (University of Oxford)
Philippe Wenk (ETH Zurich)
Michael A Osborne (University of Oxford)
Andreas Krause (ETH Zurich)
Andreas Krause is a Professor of Computer Science at ETH Zurich, where he leads the Learning & Adaptive Systems Group. He also serves as Academic Co-Director of the Swiss Data Science Center. Before that, he was an Assistant Professor of Computer Science at Caltech. He received his Ph.D. in Computer Science from Carnegie Mellon University (2008) and his Diplom in Computer Science and Mathematics from the Technical University of Munich, Germany (2004). He is a Microsoft Research Faculty Fellow and a Kavli Frontiers Fellow of the US National Academy of Sciences. He received ERC Starting Investigator and ERC Consolidator grants, the Deutscher Mustererkennungspreis, an NSF CAREER award, the Okawa Foundation Research Grant recognizing top young researchers in telecommunications, and the ETH Golden Owl teaching award. His research on machine learning and adaptive systems has received awards at several premier conferences and journals, including the ACM SIGKDD Test of Time award 2019 and the ICML Test of Time award 2020. Andreas Krause served as Program Co-Chair for ICML 2018, regularly serves as Area Chair or Senior Program Committee member for ICML, NeurIPS, AAAI, and IJCAI, and is an Action Editor for the Journal of Machine Learning Research.
Bernhard Schölkopf (Max Planck Institute for Intelligent Systems)
Stefan Bauer (MPI for Intelligent Systems)
Related Events (a corresponding poster, oral, or spotlight)
- 2019 Poster: AReS and MaRS - Adversarial and MMD-Minimizing Regression for SDEs
  Thu Jun 13th, 01:30 -- 04:00 AM, Room: Pacific Ballroom
More from the Same Authors
- 2020 Poster: From Sets to Multisets: Provable Variational Inference for Probabilistic Integer Submodular Models
  Aytunc Sahin · Yatao Bian · Joachim Buhmann · Andreas Krause
- 2020 Poster: Knowing The What But Not The Where in Bayesian Optimization
  Vu Nguyen · Michael A Osborne
- 2020 Poster: Bayesian Optimisation over Multiple Continuous and Categorical Inputs
  Binxin Ru · Ahsan Alvi · Vu Nguyen · Michael A Osborne · Stephen Roberts
- 2020 Test of Time: Gaussian Process Optimization in the Bandit Setting: No Regret and Experimental Design
  Niranjan Srinivas · Andreas Krause · Sham Kakade · Matthias W Seeger
- 2019 Poster: Online Variance Reduction with Mixtures
  Zalán Borsos · Sebastian Curi · Yehuda Levy · Andreas Krause
- 2019 Poster: Adaptive and Safe Bayesian Optimization in High Dimensions via One-Dimensional Subspaces
  Johannes Kirschner · Mojmir Mutny · Nicole Hiller · Rasmus Ischebeck · Andreas Krause
- 2019 Poster: Robustly Disentangled Causal Mechanisms: Validating Deep Representations for Interventional Robustness
  Raphael Suter · Djordje Miladinovic · Bernhard Schölkopf · Stefan Bauer
- 2019 Poster: On the Limitations of Representing Functions on Sets
  Edward Wagstaff · Fabian Fuchs · Martin Engelcke · Ingmar Posner · Michael A Osborne
- 2019 Oral: Robustly Disentangled Causal Mechanisms: Validating Deep Representations for Interventional Robustness
  Raphael Suter · Djordje Miladinovic · Bernhard Schölkopf · Stefan Bauer
- 2019 Oral: Adaptive and Safe Bayesian Optimization in High Dimensions via One-Dimensional Subspaces
  Johannes Kirschner · Mojmir Mutny · Nicole Hiller · Rasmus Ischebeck · Andreas Krause
- 2019 Oral: On the Limitations of Representing Functions on Sets
  Edward Wagstaff · Fabian Fuchs · Martin Engelcke · Ingmar Posner · Michael A Osborne
- 2019 Oral: Online Variance Reduction with Mixtures
  Zalán Borsos · Sebastian Curi · Yehuda Levy · Andreas Krause
- 2019 Poster: Learning Generative Models across Incomparable Spaces
  Charlotte Bunne · David Alvarez-Melis · Andreas Krause · Stefanie Jegelka
- 2019 Poster: Automated Model Selection with Bayesian Quadrature
  Henry Chai · Jean-Francois Ton · Michael A Osborne · Roman Garnett
- 2019 Poster: Asynchronous Batch Bayesian Optimisation with Improved Local Penalisation
  Ahsan Alvi · Binxin Ru · Jan-Peter Calliess · Stephen Roberts · Michael A Osborne
- 2019 Oral: Learning Generative Models across Incomparable Spaces
  Charlotte Bunne · David Alvarez-Melis · Andreas Krause · Stefanie Jegelka
- 2019 Oral: Automated Model Selection with Bayesian Quadrature
  Henry Chai · Jean-Francois Ton · Michael A Osborne · Roman Garnett
- 2019 Oral: Asynchronous Batch Bayesian Optimisation with Improved Local Penalisation
  Ahsan Alvi · Binxin Ru · Jan-Peter Calliess · Stephen Roberts · Michael A Osborne
- 2019 Poster: Fingerprint Policy Optimisation for Robust Reinforcement Learning
  Supratik Paul · Michael A Osborne · Shimon Whiteson
- 2019 Poster: Optimal Continuous DR-Submodular Maximization and Applications to Provable Mean Field Inference
  Yatao Bian · Joachim Buhmann · Andreas Krause
- 2019 Poster: Challenging Common Assumptions in the Unsupervised Learning of Disentangled Representations
  Francesco Locatello · Stefan Bauer · Mario Lucic · Gunnar Ratsch · Sylvain Gelly · Bernhard Schölkopf · Olivier Bachem
- 2019 Oral: Fingerprint Policy Optimisation for Robust Reinforcement Learning
  Supratik Paul · Michael A Osborne · Shimon Whiteson
- 2019 Oral: Optimal Continuous DR-Submodular Maximization and Applications to Provable Mean Field Inference
  Yatao Bian · Joachim Buhmann · Andreas Krause
- 2019 Oral: Challenging Common Assumptions in the Unsupervised Learning of Disentangled Representations
  Francesco Locatello · Stefan Bauer · Mario Lucic · Gunnar Ratsch · Sylvain Gelly · Bernhard Schölkopf · Olivier Bachem
- 2018 Poster: Fast Information-theoretic Bayesian Optimisation
  Binxin Ru · Michael A Osborne · Mark McLeod · Diego Granziol
- 2018 Poster: Optimization, fast and slow: optimally switching between local and Bayesian optimization
  Mark McLeod · Stephen Roberts · Michael A Osborne
- 2018 Oral: Optimization, fast and slow: optimally switching between local and Bayesian optimization
  Mark McLeod · Stephen Roberts · Michael A Osborne
- 2018 Oral: Fast Information-theoretic Bayesian Optimisation
  Binxin Ru · Michael A Osborne · Mark McLeod · Diego Granziol
- 2017 Poster: Guarantees for Greedy Maximization of Non-submodular Functions with Applications
  Yatao Bian · Joachim Buhmann · Andreas Krause · Sebastian Tschiatschek
- 2017 Poster: Differentially Private Submodular Maximization: Data Summarization in Disguise
  Marko Mitrovic · Mark Bun · Andreas Krause · Amin Karbasi
- 2017 Poster: Deletion-Robust Submodular Maximization: Data Summarization with "the Right to be Forgotten"
  Baharan Mirzasoleiman · Amin Karbasi · Andreas Krause
- 2017 Poster: Probabilistic Submodular Maximization in Sub-Linear Time
  Serban A Stan · Morteza Zadimoghaddam · Andreas Krause · Amin Karbasi
- 2017 Talk: Deletion-Robust Submodular Maximization: Data Summarization with "the Right to be Forgotten"
  Baharan Mirzasoleiman · Amin Karbasi · Andreas Krause
- 2017 Talk: Probabilistic Submodular Maximization in Sub-Linear Time
  Serban A Stan · Morteza Zadimoghaddam · Andreas Krause · Amin Karbasi
- 2017 Talk: Guarantees for Greedy Maximization of Non-submodular Functions with Applications
  Yatao Bian · Joachim Buhmann · Andreas Krause · Sebastian Tschiatschek
- 2017 Talk: Differentially Private Submodular Maximization: Data Summarization in Disguise
  Marko Mitrovic · Mark Bun · Andreas Krause · Amin Karbasi
- 2017 Poster: Distributed and Provably Good Seedings for k-Means in Constant Rounds
  Olivier Bachem · Mario Lucic · Andreas Krause
- 2017 Poster: Uniform Deviation Bounds for k-Means Clustering
  Olivier Bachem · Mario Lucic · Hamed Hassani · Andreas Krause
- 2017 Talk: Uniform Deviation Bounds for k-Means Clustering
  Olivier Bachem · Mario Lucic · Hamed Hassani · Andreas Krause
- 2017 Talk: Distributed and Provably Good Seedings for k-Means in Constant Rounds
  Olivier Bachem · Mario Lucic · Andreas Krause