

Boosted Density Estimation Remastered

Zac Cranko · Richard Nock

Pacific Ballroom #161

Keywords: [ Statistical Learning Theory ] [ Optimization - Others ] [ Information Theory and Estimation ] [ Generative Adversarial Networks ] [ Bayesian Methods ]

Abstract: There has recently been a steady increase in the number of iterative approaches to density estimation. However, an accompanying burst of formal convergence guarantees has not followed; all results pay the price of heavy assumptions which are often unrealistic or hard to check. The Generative Adversarial Network (GAN) literature, seemingly orthogonal to the aforementioned pursuit, has had the side effect of renewing interest in variational divergence minimisation (notably f-GAN). We show how to combine this latter approach with classical boosting theory from supervised learning to obtain the first density estimation algorithm that provably achieves geometric convergence under very weak assumptions. We do so via a trick that allows classifiers to be combined as the sufficient statistics of an exponential family. Our analysis includes an improved variational characterisation of f-GAN.
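The variational divergence minimisation the abstract refers to rests on the variational lower bound on an f-divergence, D_f(P||Q) >= E_P[T(X)] - E_Q[f*(T(X))], which f-GAN optimises over a family of witness functions T. As a minimal, hedged sketch (not the paper's algorithm), the snippet below specialises the bound to the KL divergence (f(u) = u log u, with convex conjugate f*(t) = exp(t - 1)), plugs in the optimal witness T(x) = 1 + log p(x)/q(x), for which the bound is tight, and checks the Monte Carlo estimate against the closed-form KL between two unit-variance Gaussians. All function names here are illustrative choices, not from the paper.

```python
import math
import random

def log_ratio(x, mu_p=0.0, mu_q=1.0):
    """log p(x)/q(x) for two unit-variance Gaussians N(mu_p,1), N(mu_q,1)."""
    return (-(x - mu_p) ** 2 + (x - mu_q) ** 2) / 2.0

def variational_kl_estimate(n=100_000, seed=0):
    """Monte Carlo estimate of KL(P||Q) via the variational f-divergence
    bound E_P[T(X)] - E_Q[f*(T(X))], using the tight (optimal) witness."""
    rng = random.Random(seed)
    xs_p = [rng.gauss(0.0, 1.0) for _ in range(n)]   # samples from P = N(0,1)
    xs_q = [rng.gauss(1.0, 1.0) for _ in range(n)]   # samples from Q = N(1,1)

    T = lambda x: 1.0 + log_ratio(x)                 # optimal witness for KL
    f_star = lambda t: math.exp(t - 1.0)             # conjugate of f(u) = u log u

    term_p = sum(T(x) for x in xs_p) / n             # E_P[T(X)]
    term_q = sum(f_star(T(x)) for x in xs_q) / n     # E_Q[f*(T(X))]
    return term_p - term_q

# Closed form: KL(N(0,1) || N(1,1)) = (mu_p - mu_q)^2 / 2 = 0.5.
# The variational estimate should land close to this value.
print(variational_kl_estimate())
```

In f-GAN the witness T is not known in closed form; it is parameterised by a neural network and the bound is maximised over its parameters, which is exactly where a learned classifier (as in the abstract's boosting construction) can stand in for T.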
