In this article, we propose a new class of priors for Bayesian inference with multiple Gaussian graphical models. We introduce Bayesian treatments of two popular procedures, the group graphical lasso and the fused graphical lasso, and extend them to a continuous spike-and-slab framework that allows self-adaptive shrinkage and model selection simultaneously. We develop an EM algorithm that performs fast and dynamic explorations of posterior modes. Our approach selects sparse models efficiently and automatically, with substantially smaller bias than would be induced by alternative regularization procedures. The performance of the proposed methods is demonstrated through simulation and two real data examples.
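To make the "self-adaptive shrinkage" idea concrete, the sketch below illustrates the standard spike-and-slab lasso machinery that this class of methods builds on: an E-step computes the posterior probability that each edge weight comes from the slab (a diffuse Laplace) rather than the spike (a sharply peaked Laplace), and that probability mixes the two scales into an adaptive L1 penalty. This is a minimal, hypothetical illustration of the general technique, not the authors' exact joint-graphical algorithm; all function names and parameter values are assumptions.

```python
import numpy as np

def laplace_pdf(beta, lam):
    """Laplace density with rate lam: (lam / 2) * exp(-lam * |beta|)."""
    return 0.5 * lam * np.exp(-lam * np.abs(beta))

def inclusion_prob(beta, lam_slab, lam_spike, pi):
    """E-step: posterior probability that edge weight beta is drawn
    from the slab component, given prior slab weight pi."""
    slab = pi * laplace_pdf(beta, lam_slab)
    spike = (1.0 - pi) * laplace_pdf(beta, lam_spike)
    return slab / (slab + spike)

def adaptive_penalty(beta, lam_slab, lam_spike, pi):
    """Self-adaptive shrinkage: the effective L1 penalty on beta is a
    mixture of the spike and slab rates, weighted by the E-step
    inclusion probability. Large |beta| -> mild slab penalty;
    beta near zero -> heavy spike penalty."""
    p = inclusion_prob(beta, lam_slab, lam_spike, pi)
    return p * lam_slab + (1.0 - p) * lam_spike
```

A large estimated edge weight thus receives almost no extra shrinkage (penalty near the slab rate), while a weight near zero is penalized at close to the spike rate and driven to exact zero, which is the bias-reduction mechanism relative to a single fixed lasso penalty.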
Richard Li (Yale School of Public Health)
Tyler McCormick (University of Washington)
Samuel Clark (The Ohio State University)
Related Events
2019 Oral: Bayesian Joint Spike-and-Slab Graphical Lasso
Tue Jun 11th, 04:35--04:40 PM, Room 101