

Poster

Variational Russian Roulette for Deep Bayesian Nonparametrics

Kai Xu · Akash Srivastava · Charles Sutton

Pacific Ballroom #223

Keywords: [ Bayesian Nonparametrics ] [ Approximate Inference ]


Abstract:

Bayesian nonparametric models provide a principled way to automatically adapt the complexity of a model to the amount of data available, but computation in such models is difficult. Amortized variational approximations are appealing because of their computational efficiency, but current methods rely on a fixed finite truncation of the infinite model. This truncation level can be difficult to set, and it also interacts poorly with amortized methods due to the over-pruning problem. Instead, we propose a new variational approximation based on a method from statistical physics called Russian roulette sampling. This allows the variational distribution to adapt its complexity during inference, without relying on a fixed truncation level, while still obtaining an unbiased estimate of the gradient of the original variational objective. We demonstrate this method on infinite-sized variational auto-encoders using a Beta-Bernoulli (Indian buffet process) prior.
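As a rough illustration of the core idea (not the authors' implementation), Russian roulette sampling turns an infinite sum into an unbiased estimate computed from finitely many terms: the sum is truncated at a random point, and each surviving term is reweighted by the probability of having reached it. The Python sketch below uses a hypothetical helper name russian_roulette_estimate and a simple geometric stopping rule; the paper applies the same principle to the gradient of the variational objective rather than to a scalar series.

```python
import random

def russian_roulette_estimate(term, cont_prob=0.9):
    """Unbiased estimate of sum_{k=1}^inf term(k) via Russian roulette.

    After evaluating each term we continue with probability cont_prob,
    so term k is reached with probability cont_prob**(k - 1); dividing
    by that survival probability keeps the estimator unbiased despite
    the random truncation.
    """
    total, survival, k = 0.0, 1.0, 1
    while True:
        total += term(k) / survival
        if random.random() >= cont_prob:  # roulette "fires": stop here
            return total
        survival *= cont_prob
        k += 1

# Toy check: sum_{k>=1} 2**-k == 1, recovered on average.
estimates = [russian_roulette_estimate(lambda k: 2.0**-k) for _ in range(20000)]
print(sum(estimates) / len(estimates))  # approximately 1.0
```

Any stopping rule with strictly positive continuation probabilities preserves unbiasedness; the choice of rule only affects the variance and the expected number of terms evaluated, which is what lets the variational distribution grow or shrink its effective truncation during inference.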
