Poster

Nonlinear Stein Variational Gradient Descent for Learning Diversified Mixture Models

Dilin Wang · Qiang Liu

Pacific Ballroom #231

Keywords: [ Approximate Inference ]


Abstract:

Diversification has been shown to be a powerful mechanism for learning robust models in non-convex settings. A notable example is learning mixture models, in which enforcing diversity between the different mixture components allows us to prevent the model-collapse phenomenon and capture more patterns from the observed data. In this work, we present a variational approach to diversity-promoting learning, which leverages the entropy functional as a natural mechanism for enforcing diversity. We develop a simple and efficient functional gradient-based algorithm for optimizing the variational objective, which provides a significant generalization of Stein variational gradient descent (SVGD). We test our method on various challenging real-world problems, including deep embedded clustering and deep anomaly detection. Empirical results show that our method provides an effective mechanism for diversity-promoting learning, achieving substantial improvements over existing methods.
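The abstract references standard SVGD as the algorithm being generalized, without implementation details. For context, a minimal NumPy sketch of the vanilla SVGD particle update on a 1-D standard-normal target is shown below. This is not the authors' nonlinear variant; the RBF bandwidth h, step size eps, and particle count are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(x, h):
    """RBF kernel matrix K[j, i] = k(x_j, x_i) and its gradient w.r.t. x_j."""
    diffs = x[:, None] - x[None, :]           # (n, n) pairwise differences x_j - x_i
    K = np.exp(-diffs ** 2 / (2 * h ** 2))    # kernel matrix
    grad_K = -diffs / h ** 2 * K              # d/dx_j k(x_j, x_i)
    return K, grad_K

def svgd_step(x, dlogp, h=0.5, eps=0.1):
    """One SVGD update: phi(x_i) = mean_j [k(x_j, x_i) dlogp(x_j) + grad_{x_j} k(x_j, x_i)]."""
    n = x.shape[0]
    K, grad_K = rbf_kernel(x, h)
    phi = (K @ dlogp(x) + grad_K.sum(axis=0)) / n  # driving term + repulsive term
    return x + eps * phi

# Illustrative target: standard normal, so grad log p(x) = -x.
rng = np.random.default_rng(0)
particles = rng.uniform(-5.0, 5.0, size=50)
for _ in range(500):
    particles = svgd_step(particles, lambda x: -x)
print(particles.mean(), particles.std())  # should approach 0 and 1
```

The second term of the update (the kernel gradient) acts as a repulsive force among particles; it is this diversity-enforcing role that the paper's entropy-based formulation generalizes.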
