Understanding the gradient variance of black-box variational inference (BBVI) is a crucial step for establishing its convergence and developing algorithmic improvements. However, existing studies have yet to show that the gradient variance of BBVI satisfies the conditions used to study the convergence of stochastic gradient descent (SGD), the workhorse of BBVI. In this work, we show that the gradient variance of BBVI satisfies a matching bound corresponding to the ABC condition used in the SGD literature, provided the log-likelihood is smooth and quadratically growing. Our results generalize to the nonlinear covariance parameterizations widely used in the practice of BBVI. Furthermore, we show that the variance of the mean-field parameterization has provably superior dimensional dependence.
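For context on the condition named in the abstract, below is a minimal sketch of one standard form of the ABC condition from the SGD literature. The symbols here are notational assumptions for illustration rather than notation taken from this paper: A, B, C are nonnegative constants, g(θ) is a stochastic gradient estimate of the objective f, and f^inf is a lower bound on f.

```latex
% One common statement of the ABC condition (expected smoothness) from the
% SGD literature. Symbols are assumptions for illustration:
%   A, B, C >= 0   constants
%   g(\theta)      stochastic gradient estimate of the objective f
%   f^{\inf}       a lower bound on f
\mathbb{E}\!\left[\, \lVert g(\theta) \rVert^2 \,\right]
  \;\le\; 2A\,\bigl(f(\theta) - f^{\inf}\bigr)
  \;+\; B\,\lVert \nabla f(\theta) \rVert^2
  \;+\; C
```

A bound of this shape controls the second moment of the stochastic gradient in terms of the suboptimality and the true gradient norm, which is what standard SGD convergence analyses require; the paper's contribution is showing that BBVI gradient estimators satisfy a matching bound of this kind under the stated assumptions.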
Author Information
Kyurae Kim (University of Pennsylvania)
Kaiwen Wu (University of Pennsylvania)
Jisu Oh (North Carolina State University)
Jacob Gardner (University of Pennsylvania)
Related Events (a corresponding poster, oral, or spotlight)
- 2023 Poster: Practical and Matching Gradient Variance Bounds for Black-Box Variational Bayesian Inference
  Tue. Jul 25th, 09:00 -- 11:30 PM, Exhibit Hall 1 #513
More from the Same Authors
- 2023: Black Box Adversarial Prompting for Foundation Models
  Natalie Maus · Patrick Chao · Eric Wong · Jacob Gardner