Poster

Spread Divergence

Mingtian Zhang · Peter Hayes · Thomas Bird · Raza Habib · David Barber

Keywords: [ Unsupervised and Semi-supervised Learning ] [ Unsupervised Learning ] [ Generative Models ] [ Approximate Inference ]


Abstract: For distributions $\mathbb{P}$ and $\mathbb{Q}$ with different supports or undefined densities, the divergence $\textrm{D}(\mathbb{P}||\mathbb{Q})$ may not exist. We define a Spread Divergence $\tilde{\textrm{D}}(\mathbb{P}||\mathbb{Q})$ on modified $\mathbb{P}$ and $\mathbb{Q}$ and describe sufficient conditions for the existence of such a divergence. We demonstrate how to maximize the discriminatory power of a given divergence by parameterizing and learning the spread. We also give examples of using a Spread Divergence to train implicit generative models, including linear models (Independent Components Analysis) and non-linear models (Deep Generative Networks).
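The construction in the abstract can be made concrete with a toy sketch. Per the paper's definition, the spread applies the same noise kernel $p(y|x)$ to both distributions, $\tilde{p}(y) = \int p(y|x)p(x)\,dx$, and then $\tilde{\textrm{D}}(\mathbb{P}||\mathbb{Q}) \equiv \textrm{D}(\tilde{\mathbb{P}}||\tilde{\mathbb{Q}})$. The Python below is an illustrative sketch, not code from the paper; the Gaussian kernel and the parameters `sigma`, `mu_p`, `mu_q` are assumptions chosen for the example. It spreads two point masses with a shared Gaussian kernel: the KL divergence between the original deltas is undefined (disjoint supports), while the spread KL is finite.

```python
import numpy as np

# Toy sketch: "spread" two point masses (delta at mu_p, delta at mu_q) by
# convolving each with the SAME Gaussian noise kernel p(y|x) = N(y; x, sigma^2).
# KL between the original deltas is undefined; the spread KL is finite.

sigma = 0.5            # noise scale of the assumed Gaussian spread kernel
mu_p, mu_q = 0.0, 1.0  # locations of the two point masses (made-up values)

# Spreading a delta at mu gives N(mu, sigma^2). KL between two Gaussians
# with equal variance has the closed form (mu_p - mu_q)^2 / (2 sigma^2):
kl_spread = (mu_p - mu_q) ** 2 / (2.0 * sigma ** 2)

# Monte Carlo check: estimate E_{y ~ p_tilde}[log p_tilde(y) - log q_tilde(y)].
# The Gaussian normalizing constants cancel because both spreads share sigma.
rng = np.random.default_rng(0)
y = rng.normal(mu_p, sigma, size=100_000)  # samples from the spread P
log_p = -0.5 * ((y - mu_p) / sigma) ** 2
log_q = -0.5 * ((y - mu_q) / sigma) ** 2
kl_mc = np.mean(log_p - log_q)

print(f"closed form: {kl_spread:.4f}, Monte Carlo: {kl_mc:.4f}")
```

With these values the spread KL is $1/(2 \cdot 0.25) = 2.0$, and the Monte Carlo estimate agrees; shrinking `sigma` drives the spread divergence toward infinity, recovering the degeneracy of the original comparison.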
