Poster
Scalable Variational Gaussian Processes via Harmonic Kernel Decomposition
Shengyang Sun · Jiaxin Shi · Andrew Wilson · Roger Grosse

Thu Jul 22 09:00 AM -- 11:00 AM (PDT)

We introduce a new scalable variational Gaussian process approximation which provides a high fidelity approximation while retaining general applicability. We propose the harmonic kernel decomposition (HKD), which uses Fourier series to decompose a kernel as a sum of orthogonal kernels. Our variational approximation exploits this orthogonality to enable a large number of inducing points at a low computational cost. We demonstrate that, on a range of regression and classification problems, our approach can exploit input space symmetries such as translations and reflections, and it significantly outperforms standard variational methods in scalability and accuracy. Notably, our approach achieves state-of-the-art results on CIFAR-10 among pure GP models.
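The abstract's core idea can be sketched numerically: given a kernel and a finite symmetry group (here, rotations of the circle by multiples of 2π/T), a discrete Fourier (character) sum splits the kernel into T sub-kernels that are mutually orthogonal and sum back to the original kernel. This is a minimal illustrative sketch under assumed choices (a periodic RBF-style kernel, cyclic translations), not the authors' actual implementation:

```python
import numpy as np

def periodic_kernel(x, y):
    """A simple periodic kernel on the circle (assumed for illustration)."""
    return np.exp(np.cos(x[:, None] - y[None, :]))

def harmonic_subkernels(x, y, T=4):
    """Decompose the Gram matrix into T harmonic components via

        k_t(x, y) = (1/T) * sum_s exp(-2i*pi*s*t/T) * k(x, y + 2*pi*s/T),

    mirroring the Fourier-series decomposition described in the abstract.
    """
    subs = []
    for t in range(T):
        K_t = np.zeros((len(x), len(y)), dtype=complex)
        for s in range(T):
            phase = np.exp(-2j * np.pi * s * t / T)
            K_t += phase * periodic_kernel(x, y + 2 * np.pi * s / T)
        subs.append(K_t / T)
    return subs

x = np.linspace(0, 2 * np.pi, 7, endpoint=False)
subs = harmonic_subkernels(x, x, T=4)
K = periodic_kernel(x, x)
# The character sum sum_t exp(-2i*pi*s*t/T) vanishes unless s = 0,
# so the harmonic components reconstruct the original kernel matrix.
print(np.allclose(sum(subs), K))
```

Because each sub-kernel lives in its own orthogonal subspace, inducing points can be allocated per component, which is what lets the variational approximation scale the total number of inducing points at low cost.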

Author Information

Shengyang Sun (University of Toronto)
Jiaxin Shi (Microsoft Research)
Andrew Wilson (New York University)

Andrew Gordon Wilson is faculty in the Courant Institute and Center for Data Science at NYU. His interests include probabilistic modelling, Gaussian processes, Bayesian statistics, physics-inspired machine learning, and loss surfaces and generalization in deep learning. His webpage is https://cims.nyu.edu/~andrewgw.

Roger Grosse (University of Toronto and Vector Institute)
