Inter-domain Deep Gaussian Processes
Tim G. J. Rudner · Dino Sejdinovic · Yarin Gal

Thu Jul 16 12:00 PM -- 12:45 PM & Thu Jul 16 11:00 PM -- 11:45 PM (PDT) @ Virtual

Inter-domain Gaussian processes (GPs) allow for high flexibility and low computational cost when performing approximate inference in GP models. They are particularly suitable for modeling data exhibiting global structure but are limited to stationary covariance functions and thus fail to model non-stationary data effectively. We propose Inter-domain Deep Gaussian Processes, an extension of inter-domain shallow GPs that combines the advantages of inter-domain and deep Gaussian processes (DGPs), and demonstrate how to leverage existing approximate inference methods to perform simple and scalable approximate inference using inter-domain features in DGPs. We assess the performance of our method on a range of regression tasks and demonstrate that it outperforms inter-domain shallow GPs and conventional DGPs on challenging large-scale real-world datasets exhibiting both global structure and a high degree of non-stationarity.
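To make the inter-domain idea concrete, the sketch below implements a shallow 1-D sparse GP regression with "multiscale" inter-domain inducing features (Gaussian-blurred integral transforms of the GP, in the spirit of Lázaro-Gredilla & Figueiras-Vidal's inter-domain GPs), using the standard DTC/SoR predictive mean. This is an illustrative toy, not the paper's deep-GP method: the closed-form cross-covariances for an RBF kernel and the specific inducing locations, blur widths, and toy data are all assumptions chosen for the example.

```python
# A minimal sketch of a shallow inter-domain sparse GP (NOT the paper's DGP method).
# Inducing variables are Gaussian-blurred "multiscale" features u_m = ∫ N(x; z_m, s_m²) f(x) dx,
# for which an RBF kernel gives closed-form (cross-)covariances. Pure stdlib.
import math

ELL, VAR, NOISE = 0.5, 1.0, 0.1  # assumed kernel lengthscale, variance, noise variance

def k_uf(z, s, x):
    # cov(u_m, f(x)): blurring widens the effective squared lengthscale to ell² + s².
    w = ELL ** 2 + s ** 2
    return VAR * math.sqrt(ELL ** 2 / w) * math.exp(-(z - x) ** 2 / (2 * w))

def k_uu(z1, s1, z2, s2):
    # cov(u_m, u_m'): both blurs add to the squared lengthscale.
    w = ELL ** 2 + s1 ** 2 + s2 ** 2
    return VAR * math.sqrt(ELL ** 2 / w) * math.exp(-(z1 - z2) ** 2 / (2 * w))

def solve(A, b):
    # Naive Gauss-Jordan elimination with partial pivoting (A is small and PD here).
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c:
                f = M[r][c] / M[c][c]
                M[r] = [a - f * v for a, v in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

# Toy data: a noiseless sine observed on a grid (assumption for illustration).
xs = [i / 10 for i in range(-20, 21)]
ys = [math.sin(x) for x in xs]

Z = [i / 2 for i in range(-4, 5)]  # inducing centres in the transformed domain
S = [0.2] * len(Z)                 # blur widths; s -> 0 recovers ordinary inducing points

Kuf = [[k_uf(z, s, x) for x in xs] for z, s in zip(Z, S)]

# DTC/SoR predictive mean: mu(x*) = K*u (sigma² Kuu + Kuf Kfu)^-1 Kuf y.
M = len(Z)
A = [[NOISE * k_uu(Z[i], S[i], Z[j], S[j])
      + sum(Kuf[i][n] * Kuf[j][n] for n in range(len(xs)))
      for j in range(M)] for i in range(M)]
b = [sum(Kuf[i][n] * ys[n] for n in range(len(xs))) for i in range(M)]
alpha = solve(A, b)

def predict(x_star):
    return sum(alpha[m] * k_uf(Z[m], S[m], x_star) for m in range(M))
```

The inter-domain features act like inducing points with a per-feature "blur", letting a few features summarize broader, more global structure than point evaluations can; stacking such layers into a DGP (as the paper proposes) is what handles non-stationarity.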

Author Information

Tim G. J. Rudner (University of Oxford)

I am a PhD Candidate in the Department of Computer Science at the University of Oxford, where I conduct research on probabilistic machine learning with Yarin Gal and Yee Whye Teh. My research interests span **Bayesian deep learning**, **variational inference**, and **reinforcement learning**. I am particularly interested in uncertainty quantification in deep learning, reinforcement learning as probabilistic inference, and probabilistic transfer learning. I am also a **Rhodes Scholar** and an **AI Fellow** at Georgetown University's Center for Security and Emerging Technology.

Dino Sejdinovic (University of Oxford)
Yarin Gal (University of Oxford)
