Oral
An Instability in Variational Inference for Topic Models
Behrooz Ghorbani · Hamidreza Hakim Javadi · Andrea Montanari
Naive mean field variational methods are the state-of-the-art approach to inference in topic modeling. We show that these methods suffer from an instability that can produce misleading conclusions. Namely, for certain regimes of the model parameters, variational inference outputs a non-trivial decomposition into topics. However, for the same parameter values, the data contain no actual information about the true topic decomposition, and the output of the algorithm is uncorrelated with it. In particular, the estimated posterior mean is wrong, and estimated credible regions do not achieve the nominal coverage. We discuss how this instability is remedied by more accurate mean field approximations.
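For readers unfamiliar with the setup, the sketch below shows what naive (fully factorized) mean field coordinate-ascent updates look like for the per-document topic proportions in an LDA-style model. This is an illustrative sketch, not the paper's exact model or experimental configuration; the function name, hyperparameters, and toy data are assumptions made for the example.

```python
# Minimal sketch of naive mean-field / coordinate-ascent variational
# inference for one document's topic proportions in an LDA-style model.
# All names and parameter values here are illustrative, not the paper's.
import numpy as np
from scipy.special import digamma

def mean_field_doc(word_ids, topics, alpha=0.1, n_iters=100, tol=1e-6):
    """Fully factorized q(theta) q(z) updates for a single document.

    word_ids : (N,) int array of token word indices
    topics   : (K, V) array of topic-word probabilities (rows sum to 1)
    Returns the variational Dirichlet parameters `gamma` over topics.
    """
    K = topics.shape[0]
    log_beta = np.log(topics[:, word_ids] + 1e-12)    # (K, N)
    gamma = np.full(K, alpha + len(word_ids) / K)      # initialize q(theta)
    for _ in range(n_iters):
        gamma_old = gamma.copy()
        # q(z_n = k) proportional to exp(E[log theta_k]) * beta_{k, w_n}
        elog_theta = digamma(gamma) - digamma(gamma.sum())
        log_phi = elog_theta[:, None] + log_beta       # (K, N)
        log_phi -= log_phi.max(axis=0, keepdims=True)
        phi = np.exp(log_phi)
        phi /= phi.sum(axis=0, keepdims=True)
        # q(theta) = Dirichlet(gamma) with gamma_k = alpha + sum_n phi_{nk}
        gamma = alpha + phi.sum(axis=1)
        if np.abs(gamma - gamma_old).max() < tol:
            break
    return gamma

# Toy usage: words drawn uniformly at random, so they carry no information
# about the fitted topics; the returned gamma is whatever the mean-field
# fixed point happens to be.
rng = np.random.default_rng(0)
K, V, N = 2, 50, 200
topics = rng.dirichlet(np.ones(V), size=K)
doc = rng.integers(0, V, size=N)
print(mean_field_doc(doc, topics))
```

The abstract's point is that, in certain parameter regimes, a fixed point of updates like these can look like a confident, non-trivial topic decomposition even though the data contain no information about the true decomposition.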
Author Information
Behrooz Ghorbani (Stanford University)
Hamidreza Hakim Javadi (Rice University)
Andrea Montanari (Stanford University)
Related Events (a corresponding poster, oral, or spotlight)
- 2019 Poster: An Instability in Variational Inference for Topic Models »
  Thu. Jun 13th, 01:30 -- 04:00 AM, Pacific Ballroom
More from the Same Authors
- 2021: The generalization behavior of random feature and neural tangent models »
  Andrea Montanari
- 2019: Linearized two-layers neural networks in high dimension »
  Andrea Montanari
- 2019 Poster: An Investigation into Neural Net Optimization via Hessian Eigenvalue Density »
  Behrooz Ghorbani · Shankar Krishnan · Ying Xiao
- 2019 Oral: An Investigation into Neural Net Optimization via Hessian Eigenvalue Density »
  Behrooz Ghorbani · Shankar Krishnan · Ying Xiao