Poster
Differentially Private Coordinate Descent for Composite Empirical Risk Minimization
Paul Mangold · Aurélien Bellet · Joseph Salmon · Marc Tommasi

Wed Jul 20 03:30 PM -- 05:30 PM (PDT) @ Hall E #1009

Machine learning models can leak information about the data used to train them. To mitigate this issue, Differentially Private (DP) variants of optimization algorithms like Stochastic Gradient Descent (DP-SGD) have been designed to trade off utility for privacy in Empirical Risk Minimization (ERM) problems. In this paper, we propose Differentially Private proximal Coordinate Descent (DP-CD), a new method to solve composite DP-ERM problems. We derive utility guarantees through a novel theoretical analysis of inexact coordinate descent. Our results show that, thanks to larger step sizes, DP-CD can exploit imbalance in gradient coordinates to outperform DP-SGD. We also prove new lower bounds for composite DP-ERM under coordinate-wise regularity assumptions, which are nearly matched by DP-CD. For practical implementations, we propose to clip gradients using coordinate-wise thresholds that emerge from our theory, avoiding costly hyperparameter tuning. Experiments on real and synthetic data support our results, and show that DP-CD compares favorably with DP-SGD.
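The abstract's main ingredients — per-coordinate updates with coordinate-wise step sizes, coordinate-wise gradient clipping, Gaussian noise scaled to the clipping threshold, and a proximal step for the composite (non-smooth) term — can be illustrated with a minimal sketch. This is a hypothetical illustration on an L1-regularized least-squares problem, not the paper's exact algorithm or privacy calibration: the function name `dp_cd`, the choice of clipping thresholds, and the noise scale `sigma` are all assumptions made for the example.

```python
import numpy as np

def dp_cd(X, y, reg=0.01, sigma=1.0, n_iters=1000, rng=None):
    """Hypothetical sketch of DP proximal coordinate descent for the
    lasso: min_w ||Xw - y||^2 / (2n) + reg * ||w||_1.
    Illustrative only; not the paper's exact method or noise calibration."""
    rng = np.random.default_rng(rng)
    n, p = X.shape
    w = np.zeros(p)
    # Coordinate-wise smoothness constants for least squares: M_j = ||X_j||^2 / n.
    M = (X ** 2).sum(axis=0) / n
    gamma = 1.0 / np.maximum(M, 1e-12)   # per-coordinate step sizes (can be large)
    c = np.sqrt(M)                        # coordinate-wise clipping thresholds (assumed)
    for _ in range(n_iters):
        j = rng.integers(p)                      # sample a coordinate uniformly
        g = X[:, j] @ (X @ w - y) / n            # j-th coordinate of the gradient
        g = np.clip(g, -c[j], c[j])              # clip with the coordinate's threshold
        g += sigma * c[j] * rng.normal()         # Gaussian noise scaled to c_j
        z = w[j] - gamma[j] * g
        # Proximal step for the L1 penalty: soft-thresholding.
        w[j] = np.sign(z) * max(abs(z) - gamma[j] * reg, 0.0)
    return w
```

With `sigma=0` the sketch reduces to plain proximal coordinate descent with clipped gradients; the point of the coordinate-wise thresholds is that each coordinate's noise is proportional to its own clipping level, so well-scaled coordinates are not drowned out by a single global threshold.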

Author Information

Paul Mangold (Inria)
Aurélien Bellet (Inria)
Joseph Salmon (Université de Montpellier)
Marc Tommasi
