Oral
The Kernel Interaction Trick: Fast Bayesian Discovery of Pairwise Interactions in High Dimensions
Raj Agrawal · Brian Trippe · Jonathan Huggins · Tamara Broderick

Wed Jun 12th 05:10 -- 05:15 PM @ Room 101

Discovering interaction effects on a response of interest is a fundamental problem in biology, medicine, economics, and many other scientific disciplines. In theory, Bayesian methods for discovering pairwise interactions enjoy many benefits -- including coherent uncertainty quantification, the ability to incorporate background knowledge, and desirable shrinkage properties. In practice, however, Bayesian methods are often computationally intractable for even moderate-dimensional problems. Our key insight is that many hierarchical models of practical interest admit a Gaussian process representation such that a posterior over all O(p^2) interactions need never be maintained explicitly -- only a vector of O(p) kernel hyper-parameters. This implicit representation allows us to run MCMC over model hyper-parameters in time and memory linear in p per iteration. On datasets with a variety of covariate and parameter behaviors, such as sparsity, we show that (1) our method improves running time by orders of magnitude over naive applications of MCMC, (2) our method offers improved Type I and Type II error relative to state-of-the-art LASSO-based approaches, and (3) our method offers improved computational scaling in high dimensions relative to existing Bayesian and LASSO-based approaches.
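To illustrate the flavor of the trick described in the abstract, here is a minimal sketch (not the paper's exact model) of how a kernel over all O(p^2) pairwise interactions can be evaluated in O(p) time. It assumes a hypothetical structured prior in which each main effect has variance eta_j^2 and each interaction beta_{jk} has variance eta_j^2 * eta_k^2; under that assumption, the implied GP kernel's pairwise term is an elementary symmetric polynomial that collapses to squared and fourth-power terms of a single O(p) inner product.

```python
import numpy as np

def k_naive(x, xp, eta):
    """Kernel evaluated by explicitly summing over all O(p^2) pairs.

    Corresponds (under the assumed prior) to
    sum_j eta_j^2 x_j x'_j + sum_{j<k} eta_j^2 eta_k^2 (x_j x_k)(x'_j x'_k).
    """
    p = len(x)
    main = np.sum(eta**2 * x * xp)
    pair = 0.0
    for j in range(p):
        for k in range(j + 1, p):
            pair += (eta[j]**2 * eta[k]**2) * (x[j] * x[k]) * (xp[j] * xp[k])
    return main + pair

def k_trick(x, xp, eta):
    """Same kernel in O(p): with s_j = eta_j^2 x_j x'_j,
    sum_{j<k} s_j s_k = ((sum_j s_j)^2 - sum_j s_j^2) / 2,
    so the pairwise sum never needs to be formed explicitly."""
    s = eta**2 * x * xp
    main = s.sum()
    pair = 0.5 * (s.sum()**2 - np.sum(s**2))
    return main + pair
```

Because only the O(p) vector eta enters the kernel, an MCMC sampler can target the hyper-parameter posterior directly, with each kernel evaluation costing O(p) rather than O(p^2) -- the computational pattern the abstract describes.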

Author Information

Raj Agrawal (MIT)
Brian Trippe (MIT)
Jonathan Huggins (Harvard)
Tamara Broderick (MIT)

Tamara Broderick is the ITT Career Development Assistant Professor in the Department of Electrical Engineering and Computer Science at MIT. She is a member of the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL), the MIT Statistics and Data Science Center, and the Institute for Data, Systems, and Society (IDSS).

She completed her Ph.D. in Statistics at the University of California, Berkeley in 2014. Previously, she received an AB in Mathematics from Princeton University (2007), a Master of Advanced Study for completion of Part III of the Mathematical Tripos from the University of Cambridge (2008), an MPhil by research in Physics from the University of Cambridge (2009), and an MS in Computer Science from the University of California, Berkeley (2013).

Her recent research has focused on developing and analyzing models for scalable Bayesian machine learning---especially Bayesian nonparametrics. She has been awarded an NSF CAREER Award (2018), a Sloan Research Fellowship (2018), an Army Research Office Young Investigator Program award (2017), a Google Faculty Research Award, the ISBA Lifetime Members Junior Researcher Award, the Savage Award (for an outstanding doctoral dissertation in Bayesian theory and methods), the Evelyn Fix Memorial Medal and Citation (for the Ph.D. student on the Berkeley campus showing the greatest promise in statistical research), the Berkeley Fellowship, an NSF Graduate Research Fellowship, a Marshall Scholarship, and the Phi Beta Kappa Prize (for the graduating Princeton senior with the highest academic average).
