Efficient Approximate Inference for Stationary Kernel on Frequency Domain

Yohan Jung · Kyungwoo Song · Jinkyoo Park

Hall E #814

Keywords: [ PM: Variational Inference ] [ PM: Bayesian Models and Methods ] [ PM: Monte Carlo and Sampling Methods ] [ T: Probabilistic Methods ] [ PM: Gaussian Processes ]

[ Abstract ]
[ Poster ] [ Paper PDF ]
Wed 20 Jul 3:30 p.m. PDT — 5:30 p.m. PDT
Spotlight presentation: PM: Bayesian Models and Methods
Wed 20 Jul 10:15 a.m. PDT — 11:45 a.m. PDT


Based on the Fourier duality between a stationary kernel and its spectral density, modeling the spectral density as a Gaussian mixture yields a flexible kernel, known as the Spectral Mixture (SM) kernel, that can approximate any stationary kernel. Despite its expressive power, however, this kernel is difficult to train: its large number of hyperparameters often leads to scalability and overfitting issues. To resolve these issues, we propose an approximate inference method for estimating the SM kernel hyperparameters. Specifically, we approximate the kernel with a finite set of random spectral points based on Random Fourier Features and optimize the parameters of the spectral-point distribution via sampling-based variational inference. To improve this inference procedure, we analyze the training loss and propose two techniques: a sampling method for the spectral points that reduces the approximation error of the kernel during training, and an approximate natural gradient that accelerates the convergence of parameter inference.
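As a rough illustration of the Fourier duality the abstract relies on (not the paper's actual method), the sketch below compares the closed-form 1-D SM kernel against a Monte Carlo approximation built from random spectral points sampled from its Gaussian mixture spectral density. The hyperparameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative SM kernel hyperparameters (Q = 2 mixture components, 1-D inputs)
weights = np.array([0.6, 0.4])   # mixture weights w_q (sum to 1)
means = np.array([0.5, 2.0])     # spectral means mu_q
stds = np.array([0.3, 0.1])      # spectral std devs sigma_q

def sm_kernel_exact(tau):
    """Closed-form SM kernel (Wilson & Adams):
    k(tau) = sum_q w_q * exp(-2 pi^2 sigma_q^2 tau^2) * cos(2 pi mu_q tau)."""
    return np.sum(weights * np.exp(-2.0 * np.pi**2 * stds**2 * tau**2)
                  * np.cos(2.0 * np.pi * means * tau))

def sm_kernel_rff(tau, n_spectral=20000):
    """RFF-style Monte Carlo estimate: draw spectral points s_m from the
    Gaussian mixture spectral density and average cos(2 pi s_m tau)."""
    comps = rng.choice(len(weights), size=n_spectral, p=weights)
    s = means[comps] + stds[comps] * rng.standard_normal(n_spectral)
    return np.cos(2.0 * np.pi * s * tau).mean()

tau = 0.7
print(sm_kernel_exact(tau), sm_kernel_rff(tau))
```

With enough spectral samples the Monte Carlo average converges to the exact kernel value, since the expectation of cos(2&#960;s&#964;) under a Gaussian recovers the damped-cosine form; the paper's contribution concerns how to choose and infer these spectral points efficiently.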
