Poster

Topic Modeling via Full Dependence Mixtures

Dan Fisher · Mark Kozdoba · Shie Mannor

Keywords: [ Clustering ] [ Dimensionality Reduction ] [ Non-convex Optimization ] [ Recommender Systems ] [ Unsupervised and Semi-supervised Learning ]


Abstract:

In this paper we introduce a new approach to topic modeling that scales to large datasets by using a compact representation of the data and by leveraging the GPU architecture. In this approach, topics are learned directly from the co-occurrence data of the corpus. In particular, we introduce a novel mixture model which we term the Full Dependence Mixture (FDM) model. FDMs model the second moment of the data under general generative assumptions. While there is previous work on topic modeling using second moments, we develop a direct stochastic optimization procedure for fitting an FDM with a single Kullback-Leibler objective. Moment methods in general have the benefit that the cost of an iteration no longer needs to scale with the size of the corpus. Our approach allows us to leverage standard optimizers and GPUs for the problem of topic modeling. We evaluate the approach on two large datasets, NeurIPS papers and a Twitter corpus, with a large number of topics, and show that it performs comparably to or better than the standard benchmarks.
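
To make the fitting procedure concrete, here is a minimal sketch of the kind of optimization the abstract describes: a mixture of the form P(i, j) = sum_k w_k t_k[i] t_k[j] is fit to a word co-occurrence matrix by stochastically minimizing a cross-entropy over sampled word pairs (equivalent, up to an additive constant, to the Kullback-Leibler divergence from the empirical pair distribution). This is not the authors' code; the function name fit_fdm, the softmax parametrization, and the pair-sampling scheme are illustrative assumptions, and the sketch presumes a vocabulary small enough to hold the dense co-occurrence matrix in memory.

```python
# Illustrative sketch only -- not the authors' implementation.
# Fits topics t_k and weights w_k so that sum_k w_k * t_k[i] * t_k[j]
# approximates the empirical word-pair distribution.
import torch

def fit_fdm(cooc, n_topics=50, n_steps=2000, batch_size=1024, lr=0.05,
            device="cuda" if torch.cuda.is_available() else "cpu"):
    """cooc: (V, V) nonnegative word co-occurrence count matrix."""
    V = cooc.shape[0]
    cooc = cooc.to(device=device, dtype=torch.float32)
    # Empirical pair distribution; sampling pairs from it keeps the cost
    # of a step independent of the corpus size.
    probs = (cooc / cooc.sum()).reshape(-1)

    # Unconstrained parameters; softmax keeps topics and weights on the simplex.
    topic_logits = torch.randn(n_topics, V, device=device, requires_grad=True)
    weight_logits = torch.zeros(n_topics, device=device, requires_grad=True)
    opt = torch.optim.Adam([topic_logits, weight_logits], lr=lr)

    for _ in range(n_steps):
        # Draw a batch of word pairs (i, j) from the co-occurrence data.
        idx = torch.multinomial(probs, batch_size, replacement=True)
        i = torch.div(idx, V, rounding_mode="floor")
        j = idx % V
        topics = torch.softmax(topic_logits, dim=1)    # (K, V)
        weights = torch.softmax(weight_logits, dim=0)  # (K,)
        # Mixture likelihood of each pair: sum_k w_k * t_k[i] * t_k[j].
        pair_lik = (weights[:, None] * topics[:, i] * topics[:, j]).sum(dim=0)
        # Minimizing this cross-entropy minimizes KL(empirical || model)
        # up to an additive constant.
        loss = -torch.log(pair_lik + 1e-12).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()

    return (torch.softmax(topic_logits, dim=1).detach(),
            torch.softmax(weight_logits, dim=0).detach())
```

Because each step touches only a sampled batch of pairs from the (fixed-size) co-occurrence statistics, per-iteration cost does not grow with the corpus, which is the moment-method benefit the abstract highlights; the whole loop is standard optimizer machinery that runs unchanged on a GPU.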
