Poster
in
Workshop: Humans, Algorithmic Decision-Making and Society: Modeling Interactions and Impact

Privacy-Efficacy Tradeoff of Clipped SGD with Decision-dependent Data

Qiang Li · Michal Yemini · Hoi To Wai


Abstract:

This paper studies the privacy-efficacy tradeoff of clipped SGD algorithms when there is an interplay between the data distribution and the model deployed by the algorithm during training, also known as the performative prediction setting. Our contributions are two-fold. First, we show that the projected clipped SGD (PCSGD) algorithm may converge to a biased solution bounded away from the performative stable point. We quantify lower and upper bounds on the bias magnitude and demonstrate a bias amplification phenomenon where the bias grows with the sensitivity of the data distribution. Second, we suggest remedies to trade off between the clipping bias and the privacy guarantee using an asymptotically optimal step-size design for PCSGD. Numerical experiments are presented to verify our analysis.
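The abstract describes PCSGD: at each step the data are sampled from a distribution that depends on the currently deployed model, the stochastic gradient is clipped, Gaussian noise is added for differential privacy, and the update is projected onto a constraint set. The following is a minimal sketch of that loop on a toy decision-dependent least-squares problem; the quadratic loss, the linear sensitivity model for how data shift with the model, and all parameter values (clipping threshold, noise level, step size) are illustrative assumptions, not the paper's actual setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_data(theta, n=32, eps=0.5):
    # Decision-dependent data: the feature mean shifts with the deployed
    # model theta; eps plays the role of the distribution's sensitivity.
    x = rng.normal(loc=eps * theta, scale=1.0, size=(n, theta.size))
    y = x @ np.ones(theta.size) + rng.normal(scale=0.1, size=n)
    return x, y

def clip(g, c):
    # Rescale the gradient so its Euclidean norm is at most c.
    norm = np.linalg.norm(g)
    return g if norm <= c else g * (c / norm)

def project(theta, radius):
    # Euclidean projection onto the ball of the given radius.
    norm = np.linalg.norm(theta)
    return theta if norm <= radius else theta * (radius / norm)

def pcsgd(d=5, steps=500, c=1.0, sigma=0.5, radius=10.0):
    theta = np.zeros(d)
    for t in range(1, steps + 1):
        x, y = sample_data(theta)                  # data drawn under current model
        g = 2 * x.T @ (x @ theta - y) / len(y)     # least-squares gradient
        g = clip(g, c) + sigma * rng.normal(size=d)  # clip, then add DP noise
        eta = 1.0 / t                              # diminishing step size
        theta = project(theta - eta * g, radius)
    return theta
```

Because the clipping threshold `c` caps the gradient before noise is added, the fixed point of this iteration can sit away from the performative stable point, which is the bias the abstract quantifies; a smaller `c` strengthens the privacy guarantee per step but enlarges that bias.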