Poster
Private Adaptive Optimization with Side Information
Tian Li · Manzil Zaheer · Sashank Jakkam Reddi · Virginia Smith
Hall E #1015
Keywords: [ OPT: Non-Convex ] [ Optimization ] [ OPT: Large Scale, Parallel and Distributed ] [ SA: Privacy-preserving Statistics and Machine Learning ]
Adaptive optimization methods have become the default solvers for many machine learning tasks. Unfortunately, the benefits of adaptivity may degrade when training with differential privacy, as the noise added to ensure privacy reduces the effectiveness of the adaptive preconditioner. To address this, we propose AdaDPS, a general framework that uses non-sensitive side information to precondition the gradients, allowing the effective use of adaptive methods in private settings. We formally show that AdaDPS reduces the amount of noise needed to achieve similar privacy guarantees, thereby improving optimization performance. Empirically, we leverage simple and readily available side information to explore the performance of AdaDPS in practice, comparing against strong baselines in both centralized and federated settings. Our results show that AdaDPS improves accuracy by 7.7% (absolute) on average, yielding state-of-the-art privacy-utility trade-offs on large-scale text and image benchmarks.
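The mechanism the abstract describes, preconditioning gradients with non-sensitive side information before the privacy step, can be sketched roughly as follows. This is a minimal illustration under assumptions, not the authors' released implementation: the function name, the square-root preconditioner, and the idea of estimating it from public data are choices made here for concreteness.

```python
import numpy as np

def adadps_style_dp_step(params, per_example_grads, side_info_precond,
                         clip_norm=1.0, noise_multiplier=1.0, lr=0.1,
                         eps=1e-8, rng=np.random.default_rng(0)):
    """One DP-SGD-style step with side-information preconditioning (sketch).

    Hypothetical illustration of the idea in the abstract: each per-example
    gradient is scaled by a preconditioner derived from non-sensitive side
    information (e.g., feature frequencies or gradient statistics estimated
    on public data) *before* clipping and noising, so the adaptive scaling
    itself consumes no additional privacy budget.
    """
    # Precondition per-example gradients with the public-side-information
    # statistics (RMSprop-like square-root scaling assumed here).
    preconditioned = per_example_grads / (np.sqrt(side_info_precond) + eps)

    # Clip each preconditioned per-example gradient to bound sensitivity.
    norms = np.linalg.norm(preconditioned, axis=1, keepdims=True)
    clipped = preconditioned * np.minimum(1.0, clip_norm / (norms + eps))

    # Aggregate and add Gaussian noise calibrated to the clipping norm.
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=params.shape)
    noisy_mean = (clipped.sum(axis=0) + noise) / per_example_grads.shape[0]

    return params - lr * noisy_mean
```

Because the preconditioner comes from side information rather than the private gradients, the sensitivity analysis stays the same as in plain DP-SGD, which is the intuition behind the reduced-noise claim above.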