
Private Adaptive Optimization with Side Information
Tian Li · Manzil Zaheer · Sashank Jakkam Reddi · Virginia Smith

Wed Jul 20 02:35 PM -- 02:40 PM (PDT) @ Room 307

Adaptive optimization methods have become the default solvers for many machine learning tasks. Unfortunately, the benefits of adaptivity may degrade when training with differential privacy, as the noise added to ensure privacy reduces the effectiveness of the adaptive preconditioner. To this end, we propose AdaDPS, a general framework that uses non-sensitive side information to precondition the gradients, allowing the effective use of adaptive methods in private settings. We formally show AdaDPS reduces the amount of noise needed to achieve similar privacy guarantees, thereby improving optimization performance. Empirically, we leverage simple and readily available side information to explore the performance of AdaDPS in practice, comparing to strong baselines in both centralized and federated settings. Our results show that AdaDPS improves accuracy by 7.7% (absolute) on average---yielding state-of-the-art privacy-utility trade-offs on large-scale text and image benchmarks.
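To make the idea in the abstract concrete, below is a minimal, illustrative sketch of one private step that preconditions per-example gradients with non-sensitive side information (e.g., second-moment estimates from public data) before clipping and adding Gaussian noise, as in DP-SGD. This is not the authors' AdaDPS implementation; the function name, the numpy-based setup, and all hyperparameters are assumptions for illustration only.

```python
import numpy as np

def adadps_style_step(w, example_grads, side_info_sq, lr=0.1,
                      clip_norm=1.0, noise_mult=1.0, eps=1e-8):
    """One illustrative private step (assumed form, not the paper's exact
    algorithm): precondition each per-example gradient with non-sensitive
    second-moment estimates, then clip and add Gaussian noise."""
    # Precondition with side information (e.g., estimated from public data).
    precond = example_grads / (np.sqrt(side_info_sq) + eps)

    # Per-example clipping to bound the sensitivity of the sum.
    norms = np.linalg.norm(precond, axis=1, keepdims=True)
    clipped = precond * np.minimum(1.0, clip_norm / (norms + 1e-12))

    # Aggregate, add noise calibrated to the clipping norm, and average.
    noise = np.random.normal(0.0, noise_mult * clip_norm, size=w.shape)
    avg = (clipped.sum(axis=0) + noise) / example_grads.shape[0]

    return w - lr * avg

# Example usage on synthetic data (shapes only; values are arbitrary).
rng = np.random.default_rng(0)
w = np.zeros(5)
grads = rng.normal(size=(32, 5))          # per-example gradients for one batch
side_sq = np.abs(rng.normal(size=5)) + 1  # non-sensitive second-moment estimates
w = adadps_style_step(w, grads, side_sq)
```

Because the preconditioner is computed from non-sensitive side information rather than from the private gradients themselves, it does not consume privacy budget, which is the intuition behind why preconditioning before clipping and noising can preserve the benefits of adaptivity.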

Author Information

Tian Li (Carnegie Mellon University)
Manzil Zaheer (Google Research)
Sashank Jakkam Reddi (Google)
Virginia Smith (Carnegie Mellon University)

Virginia Smith is an assistant professor in the Machine Learning Department at Carnegie Mellon University.
