Gaussian processes are the gold standard for many real-world modeling problems, especially in cases where a model's success hinges upon its ability to faithfully represent predictive uncertainty. These problems typically exist as parts of larger frameworks, wherein quantities of interest are ultimately defined by integrating over posterior distributions. These quantities are frequently intractable, motivating the use of Monte Carlo methods. Despite substantial progress in scaling up Gaussian processes to large training sets, methods for accurately generating draws from their posterior distributions still scale cubically in the number of test locations. We identify a decomposition of Gaussian processes that naturally lends itself to scalable sampling by separating out the prior from the data. Building on this factorization, we propose an easy-to-use and general-purpose approach for fast posterior sampling, which seamlessly pairs with sparse approximations to afford scalability both during training and at test time. In a series of experiments designed to test competing sampling schemes' statistical properties and practical ramifications, we demonstrate how decoupled sample paths accurately represent Gaussian process posteriors at a fraction of the usual cost.
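The decomposition the abstract refers to writes a posterior sample as a prior function draw plus a data-dependent correction (Matheron's rule / pathwise conditioning). The sketch below illustrates this idea for an exact GP with an RBF kernel, using random Fourier features for the prior draw; it is not the paper's reference implementation, and the function names, feature count, lengthscale, and noise level are illustrative assumptions.

```python
import numpy as np

def rbf_features(X, omegas, phases):
    """Random Fourier features approximating an RBF kernel prior."""
    L = omegas.shape[0]
    return np.sqrt(2.0 / L) * np.cos(X @ omegas.T + phases)

def decoupled_posterior_sample(X_train, y_train, X_test, noise=1e-2,
                               lengthscale=1.0, n_features=1000, seed=0):
    """One posterior draw = prior function draw + data-dependent update
    (Matheron's rule), evaluated at arbitrary test inputs."""
    rng = np.random.default_rng(seed)
    d = X_train.shape[1]

    # Prior draw via random Fourier features: f(x) ~= phi(x) @ w, w ~ N(0, I).
    omegas = rng.normal(size=(n_features, d)) / lengthscale
    phases = rng.uniform(0, 2 * np.pi, size=n_features)
    w = rng.normal(size=n_features)
    f_prior_train = rbf_features(X_train, omegas, phases) @ w
    f_prior_test = rbf_features(X_test, omegas, phases) @ w

    # Exact RBF kernel for the data-dependent update term.
    def kernel(A, B):
        sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * sq / lengthscale ** 2)

    K_nn = kernel(X_train, X_train) + noise * np.eye(len(X_train))
    K_tn = kernel(X_test, X_train)

    # Matheron update: shift the prior draw toward the (noisy) residuals.
    eps = rng.normal(scale=np.sqrt(noise), size=len(X_train))
    residual = y_train - f_prior_train - eps
    return f_prior_test + K_tn @ np.linalg.solve(K_nn, residual)
```

Under this decomposition, the training-set solve is paid once per draw, after which evaluating the sampled path at additional test locations costs only a matrix-vector product per location, rather than the cubic cost of jointly sampling from the test-point covariance.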
Author Information
James Wilson (Imperial College London)
Viacheslav Borovitskiy (St. Petersburg Department of Steklov Mathematical Institute of Russian Academy of Sciences (PDMI RAS))
Alexander Terenin (Imperial College London)
Peter Mostowsky (Saint Petersburg State University)
Marc Deisenroth (University College London)
More from the Same Authors
- 2022: Faster Training of Neural ODEs Using Gauß–Legendre Quadrature
  Alexander Norcliffe · Marc Deisenroth
- 2022 Workshop: Continuous Time Perspectives in Machine Learning
  Mihaela Rosca · Chongli Qin · Julien Mairal · Marc Deisenroth
- 2020 Poster: Stochastic Differential Equations with Variational Wishart Diffusions
  Martin Jørgensen · Marc Deisenroth · Hugh Salimbeni
- 2020 Poster: Healing Products of Gaussian Process Experts
  Samuel Cohen · Rendani Mbuvha · Tshilidzi Marwala · Marc Deisenroth
- 2019 Poster: Deep Gaussian Processes with Importance-Weighted Variational Inference
  Hugh Salimbeni · Vincent Dutordoir · James Hensman · Marc P Deisenroth
- 2019 Oral: Deep Gaussian Processes with Importance-Weighted Variational Inference
  Hugh Salimbeni · Vincent Dutordoir · James Hensman · Marc P Deisenroth
- 2018 Poster: Design of Experiments for Model Discrimination Hybridising Analytical and Data-Driven Approaches
  Simon Olofsson · Marc P Deisenroth · Ruth Misener
- 2018 Oral: Design of Experiments for Model Discrimination Hybridising Analytical and Data-Driven Approaches
  Simon Olofsson · Marc P Deisenroth · Ruth Misener