We introduce a kernel approximation strategy that enables computation of the Gaussian process log marginal likelihood and all hyperparameter derivatives in O(p) time. Our GRIEF kernel consists of p eigenfunctions found using a Nyström approximation on a dense Cartesian product grid of inducing points. By exploiting algebraic properties of Kronecker and Khatri-Rao tensor products, the computational complexity of the training procedure can be made practically independent of the number of inducing points. This allows us to use arbitrarily many inducing points to achieve a globally accurate kernel approximation, even in high-dimensional problems. The fast likelihood evaluation enables type-I or type-II Bayesian inference on large-scale datasets. We benchmark our algorithms on real-world problems with up to two million training points and 10^33 inducing points.
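The following is a minimal NumPy sketch (not the authors' implementation) of the structural identity that makes enormous Cartesian product grids tractable: for a product kernel, a Nyström eigenfunction of the grid kernel can be evaluated from per-dimension Gram matrices and eigendecompositions alone, so the cost per point scales with the sum, rather than the product, of the per-dimension grid sizes. All names (rbf_1d, m_per_dim, the chosen eigenpair) are illustrative assumptions, and the example uses a tiny grid only so the dense check is feasible.

```python
# Hedged sketch: per-dimension evaluation of a Nystrom eigenfunction
# for a product (RBF) kernel on a Cartesian grid of inducing points.
import numpy as np
from functools import reduce

def rbf_1d(a, b, ell=0.5):
    """1-D squared-exponential kernel matrix (illustrative choice)."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

rng = np.random.default_rng(0)
d = 2                               # input dimension (small, to allow a dense check)
m_per_dim = [7, 6]                  # inducing points per dimension
grids = [np.linspace(0.0, 1.0, m) for m in m_per_dim]
X = rng.uniform(0.0, 1.0, size=(5, d))   # a few test inputs

# Per-dimension Gram matrices on the 1-D grids and their eigendecompositions.
Ks = [rbf_1d(g, g) for g in grids]
eigs = [np.linalg.eigh(K) for K in Ks]    # (ascending eigenvalues, eigenvectors)

# Choose the leading eigenpair in each dimension (any index tuple works).
lam = np.prod([eigs[j][0][-1] for j in range(d)])
q_dims = [eigs[j][1][:, -1] for j in range(d)]

# Structured evaluation: k(x, U) @ (q_1 kron ... kron q_d)
#   = prod_j ( k_j(x_j, U_j) @ q_j ),
# costing O(sum_j m_j) per point instead of O(prod_j m_j).
phi_fast = np.ones(X.shape[0])
for j in range(d):
    phi_fast *= rbf_1d(X[:, j], grids[j]) @ q_dims[j]
phi_fast /= lam                     # Nystrom eigenfunction scaling (up to constants)

# Dense check: form the full grid and the Kronecker eigenvector explicitly.
U = np.stack(np.meshgrid(*grids, indexing="ij"), axis=-1).reshape(-1, d)
K_XU = np.ones((X.shape[0], U.shape[0]))
for j in range(d):
    K_XU *= rbf_1d(X[:, j], U[:, j])
q_full = reduce(np.kron, q_dims)
phi_dense = (K_XU @ q_full) / lam

assert np.allclose(phi_fast, phi_dense)
```

Because the per-dimension factors never have to be combined explicitly, the same identity applies when the full grid holds astronomically many inducing points, which is the regime the abstract refers to.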
Author Information
Trefor Evans (University of Toronto)
Prasanth B Nair (University of Toronto)
Related Events (a corresponding poster, oral, or spotlight)
-
2018 Oral: Scalable Gaussian Processes with Grid-Structured Eigenfunctions (GP-GRIEF)
Fri Jul 13th 09:00 -- 09:20 AM Room A4