Oral
Variational Implicit Processes
Chao Ma · Yingzhen Li · Jose Hernandez-Lobato

Wed Jun 12th 02:40 -- 03:00 PM @ Room 101

We introduce the implicit process (IP), a stochastic process that places an implicitly defined multivariate distribution over any finite collection of random variables. IPs are therefore highly flexible implicit priors over \emph{functions}; examples include data simulators, Bayesian neural networks, and non-linear transformations of stochastic processes. A novel and efficient function-space approximate Bayesian inference algorithm for IPs, the variational implicit process (VIP), is derived using generalised wake-sleep updates. The method yields simple update equations and supports scalable hyper-parameter learning via stochastic optimization. Experiments demonstrate that VIPs return better uncertainty estimates and superior performance over existing inference methods for challenging models such as Bayesian LSTMs, Bayesian neural networks, and Gaussian processes.
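The defining idea above — an implicitly specified joint distribution over function values at any finite collection of inputs — can be illustrated with one of the abstract's own examples, a Bayesian neural network used as the sampler. The sketch below is illustrative only (the architecture, prior scales, and function names are assumptions, not the paper's setup): sampling weights and evaluating the network at inputs `x` yields joint function-value samples, while the marginal density itself has no closed form.

```python
import numpy as np

def sample_ip_function_values(x, hidden=50, n_samples=3, seed=None):
    """Draw joint samples of f(x) at a finite collection of inputs x
    by sampling the weights of a one-hidden-layer neural network prior.
    This defines the distribution over f(x) only implicitly: we can
    sample from it, but cannot evaluate its density in closed form.
    (Hypothetical illustration, not the paper's implementation.)"""
    rng = np.random.default_rng(seed)
    samples = []
    for _ in range(n_samples):
        # One weight sample ~ prior => one function sample f.
        w1 = rng.normal(0.0, 1.0, size=(1, hidden))
        b1 = rng.normal(0.0, 1.0, size=hidden)
        w2 = rng.normal(0.0, 1.0 / np.sqrt(hidden), size=(hidden, 1))
        b2 = rng.normal(0.0, 1.0, size=1)
        f = np.tanh(x[:, None] @ w1 + b1) @ w2 + b2
        samples.append(f.ravel())
    # Shape (n_samples, len(x)): each row is one joint draw over all inputs.
    return np.stack(samples)

x = np.linspace(-3.0, 3.0, 100)
fs = sample_ip_function_values(x, n_samples=5, seed=0)
```

Any other stochastic simulator with sampleable parameters could replace the network here; that interchangeability is what makes the IP prior flexible.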

Author Information

Chao Ma (University of Cambridge)
Yingzhen Li (Microsoft Research Cambridge)
Jose Hernandez-Lobato (University of Cambridge)
