
Variational nearest neighbor Gaussian process
Luhuan Wu · Geoff Pleiss · John Cunningham

Tue Jul 19 03:30 PM -- 05:30 PM (PDT) @ Hall E #739
Variational approximations to Gaussian processes (GPs) typically use a small set of inducing points to form a low-rank approximation to the covariance matrix. In this work, we instead exploit a sparse approximation of the precision matrix. We propose the variational nearest neighbor Gaussian process (VNNGP), which introduces a prior that retains correlations only among the $K$ nearest-neighboring observations, thereby inducing sparse precision structure. Under the variational framework, VNNGP's objective factorizes over both observations and inducing points, enabling stochastic optimization with a time complexity of $O(K^3)$ per term. Hence, we can arbitrarily scale the number of inducing points, even to the point of placing an inducing point at every observed location. We compare VNNGP to other scalable GPs through various experiments, and demonstrate that VNNGP (1) can dramatically outperform low-rank methods, and (2) is less prone to overfitting than other nearest neighbor methods.
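The core idea behind the sparse precision structure is that conditioning each point only on its $K$ nearest neighbors factorizes the Gaussian prior into small conditionals, each costing $O(K^3)$. The following is a minimal numpy sketch of this nearest-neighbor conditioning idea (a Vecchia-style factorization of the prior density), not the authors' variational implementation; the RBF kernel, the left-to-right ordering of inputs, and all function names are illustrative assumptions.

```python
import numpy as np

def rbf(a, b, ls=1.0):
    """RBF kernel matrix between 1-D input arrays a and b (assumed kernel)."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def nn_gp_logpdf(x, y, K, ls=1.0, jitter=1e-8):
    """Nearest-neighbor GP log density: each y_i is conditioned only on
    its K nearest neighbors among the preceding (ordered) points, so each
    term involves at most a K x K solve, i.e. O(K^3) per observation."""
    order = np.argsort(x)          # simple left-to-right ordering
    x, y = x[order], y[order]
    n = len(x)
    logp = 0.0
    for i in range(n):
        prev = np.arange(i)
        if len(prev) > K:          # keep the K nearest preceding points
            d = np.abs(x[prev] - x[i])
            prev = prev[np.argsort(d)[:K]]
        kii = rbf(x[i:i + 1], x[i:i + 1], ls)[0, 0] + jitter
        if len(prev) == 0:
            mu, var = 0.0, kii
        else:
            Knn = rbf(x[prev], x[prev], ls) + jitter * np.eye(len(prev))
            kni = rbf(x[prev], x[i:i + 1], ls)[:, 0]
            w = np.linalg.solve(Knn, kni)   # K x K solve: O(K^3)
            mu = w @ y[prev]                # conditional mean
            var = kii - kni @ w             # conditional variance
        logp += -0.5 * (np.log(2 * np.pi * var) + (y[i] - mu) ** 2 / var)
    return logp
```

With `K = n - 1` every point conditions on all of its predecessors, and the chain rule of Gaussian densities recovers the exact joint GP log density; smaller `K` trades a sparser precision (and lower cost) for an approximation of the long-range correlations.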

Author Information

Luhuan Wu (Columbia University)
Geoff Pleiss (Columbia University)
John Cunningham (Columbia University)
