Tutorial
Variational Bayes and Beyond: Bayesian Inference for Big Data
Tamara Broderick

Tue Jul 10th 03:45 -- 06:00 PM @ Victoria
Event URL: http://www.tamarabroderick.com/tutorial_2018_icml.html

Bayesian methods exhibit a number of desirable properties for modern data analysis---including (1) coherent quantification of uncertainty, (2) a modular modeling framework able to capture complex phenomena, (3) the ability to incorporate prior information from an expert source, and (4) interpretability. In practice, though, Bayesian inference necessitates approximation of a high-dimensional integral, and some traditional algorithms for this purpose can be slow---notably at data scales of current interest. The tutorial will cover modern tools for fast, approximate Bayesian inference at scale. One increasingly popular framework is provided by "variational Bayes" (VB), which formulates Bayesian inference as an optimization problem. We will examine key benefits and pitfalls of using VB in practice, with a focus on the widespread "mean-field variational Bayes" (MFVB) subtype. We will highlight properties that anyone working with VB, from the data analyst to the theoretician, should be aware of. In addition to VB, we will cover recent data summarization techniques for scalable Bayesian inference that come equipped with finite-data theoretical guarantees on quality. We will motivate our exploration throughout with practical data analysis examples and point to a number of open problems in the field.
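For readers unfamiliar with the formulation, here is a minimal sketch, in standard notation rather than the tutorial's own materials, of the optimization problem that VB solves. Given data x, parameters theta, and an approximating family Q, VB seeks the member of Q closest in KL divergence to the posterior, which is equivalent to maximizing the evidence lower bound (ELBO); MFVB additionally restricts Q to fully factorized distributions.

\[
q^\star
\;=\; \operatorname*{arg\,min}_{q \in \mathcal{Q}} \, \mathrm{KL}\bigl(q(\theta) \,\|\, p(\theta \mid x)\bigr)
\;=\; \operatorname*{arg\,max}_{q \in \mathcal{Q}} \,
\underbrace{\mathbb{E}_{q}\bigl[\log p(x, \theta)\bigr] - \mathbb{E}_{q}\bigl[\log q(\theta)\bigr]}_{\text{ELBO}(q)},
\qquad
\text{MFVB: } q(\theta) = \prod_{j} q_j(\theta_j).
\]

The two problems share a maximizer because \(\log p(x) = \mathrm{KL}(q \,\|\, p(\cdot \mid x)) + \text{ELBO}(q)\) and \(\log p(x)\) does not depend on q.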


Author Information

Tamara Broderick (MIT)

Tamara Broderick is the ITT Career Development Assistant Professor in the Department of Electrical Engineering and Computer Science at MIT. She is a member of the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL), the MIT Statistics and Data Science Center, and the Institute for Data, Systems, and Society (IDSS). She completed her Ph.D. in Statistics at the University of California, Berkeley in 2014. Previously, she received an AB in Mathematics from Princeton University (2007), a Master of Advanced Study for completion of Part III of the Mathematical Tripos from the University of Cambridge (2008), an MPhil by research in Physics from the University of Cambridge (2009), and an MS in Computer Science from the University of California, Berkeley (2013). Her recent research has focused on developing and analyzing models for scalable Bayesian machine learning---especially Bayesian nonparametrics. She has been awarded an NSF CAREER Award (2018), a Sloan Research Fellowship (2018), an Army Research Office Young Investigator Program award (2017), a Google Faculty Research Award, the ISBA Lifetime Members Junior Researcher Award, the Savage Award (for an outstanding doctoral dissertation in Bayesian theory and methods), the Evelyn Fix Memorial Medal and Citation (for the Ph.D. student on the Berkeley campus showing the greatest promise in statistical research), the Berkeley Fellowship, an NSF Graduate Research Fellowship, a Marshall Scholarship, and the Phi Beta Kappa Prize (for the graduating Princeton senior with the highest academic average).
