Oral
Multi-Epoch Matrix Factorization Mechanisms for Private Machine Learning
Christopher Choquette-Choo · Hugh B McMahan · J K Rush · Abhradeep Guha Thakurta

Wed Jul 26 08:12 PM -- 08:20 PM (PDT) @ Meeting Room 316 A-C
We introduce new differentially private (DP) mechanisms for gradient-based machine learning (ML) with multiple passes (epochs) over a dataset, substantially improving the achievable privacy-utility-computation tradeoffs. We formalize the problem of DP mechanisms for adaptive streams with multiple participations and introduce a non-trivial extension of online matrix factorization DP mechanisms to our setting. This includes establishing the necessary theory for sensitivity calculations and efficient computation of optimal matrices. For some applications like $>\!\! 10,000$ SGD steps, applying these optimal techniques becomes computationally expensive. We thus design an efficient Fourier-transform-based mechanism with only a minor utility loss. Extensive empirical evaluation on both example-level DP for image classification and user-level DP for language modeling demonstrates substantial improvements over all previous methods, including the widely-used DP-SGD. Though our primary application is to ML, our main DP results are applicable to arbitrary linear queries and hence may have much broader applicability.
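As background for the abstract, here is a minimal, hypothetical sketch (NumPy/SciPy) of the single-participation matrix factorization mechanism the paper builds on: the prefix-sum workload A is factored as A = B C, Gaussian noise is added to C x, and B decodes the result. The square-root factorization, the helper names (prefix_sum_workload, factorize, mf_mechanism), and the single-epoch sensitivity formula are illustrative assumptions; the paper's contributions, computing optimal factorizations and handling multi-epoch (multi-participation) sensitivity, are not reproduced here.

import numpy as np
from scipy.linalg import sqrtm

def prefix_sum_workload(n):
    # Lower-triangular all-ones matrix A: releasing A @ x yields the running
    # sums of per-step gradients that SGD-style training consumes.
    return np.tril(np.ones((n, n)))

def factorize(A):
    # Toy factorization A = B @ C via the matrix square root
    # (illustrative only, not an optimal factorization).
    B = np.real(sqrtm(A))
    return B, B

def mf_mechanism(x, B, C, sigma, rng):
    # Release B @ (C @ x + z): Gaussian noise is added in the encoded space
    # and decoded by B, yielding correlated noise on the prefix sums.
    # Single-participation L2 sensitivity of x -> C @ x (each example changes
    # at most one row of x by a unit-norm vector) is the max column norm of C.
    sens_C = np.max(np.linalg.norm(C, axis=0))
    z = rng.normal(scale=sigma * sens_C, size=x.shape)
    return B @ (C @ x + z)

rng = np.random.default_rng(0)
n, d = 16, 4                      # 16 SGD steps, 4-dimensional clipped gradients
x = rng.normal(size=(n, d))       # stand-in for per-step clipped gradient sums
A = prefix_sum_workload(n)
B, C = factorize(A)
noisy = mf_mechanism(x, B, C, sigma=1.0, rng=rng)
print("relative error:", np.linalg.norm(noisy - A @ x) / np.linalg.norm(A @ x))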

Author Information

Christopher Choquette-Choo (Google DeepMind)
Hugh B McMahan (Google)
J K Rush (Google)
Abhradeep Guha Thakurta (Google DeepMind)
