Oral
Locally Private Bayesian Inference for Count Models
Aaron Schein · Steven Wu · Alexandra Schofield · Mingyuan Zhou · Hanna Wallach

Tue Jun 11 04:40 PM -- 05:00 PM (PDT) @ Room 102

We present a general method for privacy-preserving Bayesian inference in Poisson factorization, a broad class of models that includes some of the most widely used models in the social sciences. Our method satisfies limited precision local privacy, a generalization of local differential privacy, which we introduce to formulate privacy guarantees appropriate for sparse count data. We develop an MCMC algorithm that approximates the locally private posterior over model parameters given data that has been locally privatized by the geometric mechanism (Ghosh et al., 2012). Our solution is based on two insights: 1) a novel reinterpretation of the geometric mechanism in terms of the Skellam distribution (Skellam, 1946) and 2) a general theorem that relates the Skellam to the Bessel distribution (Yuan & Kalbfleisch, 2000). We demonstrate our method in two case studies on real-world email data in which we show that our method consistently outperforms the commonly used naïve approach, obtaining higher-quality topics in text and more accurate link prediction in networks. On some tasks, our privacy-preserving method even outperforms non-private inference, which conditions on the true data.
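As context for the privatization step named in the abstract, the following is a minimal Python sketch of locally privatizing a count vector with the geometric mechanism (Ghosh et al., 2012), i.e., adding two-sided geometric noise. The helper names `two_sided_geometric_noise` and `privatize_counts` are illustrative, not from the paper; the sketch assumes sensitivity-1 integer counts and relies on the standard fact that a two-sided geometric variable is the difference of two i.i.d. geometric variables, the representation the paper's Skellam-based reinterpretation builds on.

```python
import numpy as np

def two_sided_geometric_noise(size, alpha, rng=None):
    """Draw two-sided geometric noise with parameter alpha in (0, 1).

    A two-sided geometric variable is the difference of two i.i.d.
    geometric variables with success probability (1 - alpha), giving
    P(k) proportional to alpha^|k|.
    """
    rng = np.random.default_rng() if rng is None else rng
    # numpy's geometric has support {1, 2, ...}; subtract 1 to shift to {0, 1, 2, ...}
    g1 = rng.geometric(1.0 - alpha, size=size) - 1
    g2 = rng.geometric(1.0 - alpha, size=size) - 1
    return g1 - g2

def privatize_counts(counts, epsilon, rng=None):
    """Locally privatize integer counts with the geometric mechanism.

    Assumes sensitivity-1 counts, so the noise parameter is alpha = exp(-epsilon).
    (Hypothetical helper for illustration; not the authors' code.)
    """
    counts = np.asarray(counts)
    alpha = np.exp(-epsilon)
    return counts + two_sided_geometric_noise(counts.shape, alpha, rng=rng)

# Example: privatize a sparse count vector at epsilon = 1.0
y = np.array([0, 0, 3, 0, 1, 0, 0, 7])
y_private = privatize_counts(y, epsilon=1.0)
```

The released values `y_private` are what the paper's MCMC algorithm would condition on in place of the true counts.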

Author Information

Aaron Schein (UMass Amherst)
Steven Wu (University of Minnesota)
Alexandra Schofield (Cornell University)
Mingyuan Zhou (University of Texas at Austin)
Hanna Wallach (Microsoft Research)
