We describe a framework for constructing nonstationary nonseparable random fields based on an infinite mixture of convolved stochastic processes. When the mixing process is stationary but the convolution function is nonstationary, we arrive at nonseparable kernels with constant nonseparability that are available in closed form. When the mixing is nonstationary and the convolution function is stationary, we arrive at nonseparable random fields that have varying nonseparability and better preserve local structure. These fields have natural interpretations through the spectral representation of stochastic differential equations (SDEs) and are demonstrated on a range of synthetic benchmarks and spatio-temporal applications in geostatistics and machine learning. We show how a single Gaussian process (GP) with these random fields can computationally and statistically outperform both separable and existing nonstationary nonseparable approaches, such as treed GPs and deep GP constructions.
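The process-convolution construction in the abstract can be illustrated numerically. The sketch below is a minimal illustration under assumptions of our own (it is not the authors' method): it discretises the covariance k(x, x') ≈ Δu · Σ_u K(x, u) K(x', u) of a process built by convolving white noise with a Gaussian smoothing kernel, and makes the field nonstationary by letting the convolution lengthscale vary with the input. The function names and the specific lengthscale map are illustrative choices.

```python
import numpy as np

def conv_kernel(x, us, ell):
    # Gaussian smoothing kernel centred at each grid point u,
    # with an input-dependent lengthscale ell = ell(x).
    return np.exp(-0.5 * ((x - us) / ell) ** 2)

def gram_matrix(xs, lengthscale_fn, n_grid=200, lo=-3.0, hi=3.0):
    # Discretised process-convolution covariance:
    #   k(x, x') ≈ Δu * Σ_u K(x, u) K(x', u),
    # nonstationary because the lengthscale depends on x.
    us = np.linspace(lo, hi, n_grid)
    du = us[1] - us[0]
    Phi = np.stack([conv_kernel(x, us, lengthscale_fn(x)) for x in xs])
    return du * Phi @ Phi.T

# Draw one sample path of the induced nonstationary field.
xs = np.linspace(-2.0, 2.0, 50)
K = gram_matrix(xs, lambda x: 0.2 + 0.3 * abs(x))  # illustrative lengthscale map
K += 1e-8 * np.eye(len(xs))                        # jitter for numerical stability
sample = np.linalg.cholesky(K) @ np.random.randn(len(xs))
```

By construction K = Δu · ΦΦᵀ is positive semidefinite, so the Cholesky factorisation (after a small jitter) always succeeds; the varying lengthscale produces smoother behaviour away from the origin and rougher behaviour near it.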
Author Information
Kangrui Wang (The Alan Turing Institute)
Oliver Hamelijnck (The Alan Turing Institute)
Theodoros Damoulas (University of Warwick & The Alan Turing Institute)
Mark Steel (University of Warwick)
More from the Same Authors
- 2021 Poster: SigGPDE: Scaling Sparse Gaussian Processes on Sequential Data
  Maud Lemercier · Cristopher Salvi · Thomas Cass · Edwin V Bonilla · Theodoros Damoulas · Terry Lyons
- 2021 Spotlight: SigGPDE: Scaling Sparse Gaussian Processes on Sequential Data
  Maud Lemercier · Cristopher Salvi · Thomas Cass · Edwin V Bonilla · Theodoros Damoulas · Terry Lyons
- 2018 Poster: Spatio-temporal Bayesian On-line Changepoint Detection with Model Selection
  Jeremias Knoblauch · Theodoros Damoulas
- 2018 Oral: Spatio-temporal Bayesian On-line Changepoint Detection with Model Selection
  Jeremias Knoblauch · Theodoros Damoulas