Local Differential Privacy with Entropic Wasserstein Distance
Daria Reshetova · Wei-Ning Chen · Ayfer Ozgur
Event URL: https://openreview.net/forum?id=I0ZPKZOs6G

Local differential privacy (LDP) is a powerful method for privacy-preserving data collection. In this paper, we develop a framework for training Generative Adversarial Networks (GANs) on differentially privatized data. We show that entropic regularization of the Wasserstein distance, a popular regularization method often leveraged in the literature for its computational benefits, can be used to denoise the data distribution when the data is privatized by popular additive-noise mechanisms such as Laplace and Gaussian. This combination uniquely enables the mitigation of both the regularization bias and the effects of privatization noise, thereby enhancing the overall efficacy of the model. We analyze the proposed method and provide sample complexity results and experimental evidence to support its efficacy.
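As background for the abstract above: the entropic-regularized Wasserstein distance is typically computed with Sinkhorn iterations. The sketch below is a minimal, illustrative NumPy implementation comparing generator samples against Laplace-privatized data; the function name, noise scale, and sample sizes are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def sinkhorn_cost(x, y, eps=0.5, n_iter=200):
    """Entropic-regularized OT cost between 1-D samples x and y via Sinkhorn iterations."""
    C = (x[:, None] - y[None, :]) ** 2      # squared-distance cost matrix
    K = np.exp(-C / eps)                    # Gibbs kernel for regularization strength eps
    a = np.full(len(x), 1.0 / len(x))       # uniform weights on x
    b = np.full(len(y), 1.0 / len(y))       # uniform weights on y
    u = np.ones_like(a)
    for _ in range(n_iter):                 # alternating scaling updates
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]         # resulting transport plan
    return float((P * C).sum())             # transport cost under the plan

rng = np.random.default_rng(0)
data = rng.normal(0.0, 1.0, size=500)                    # clients' true data
privatized = data + rng.laplace(0.0, 0.5, size=500)      # Laplace mechanism (illustrative scale)
generated = rng.normal(0.0, 1.0, size=500)               # generator output

cost = sinkhorn_cost(generated, privatized)
```

The paper's point is that the entropic regularizer itself can be matched to the additive privatization noise so that its smoothing acts as denoising; the sketch only shows the plain Sinkhorn objective that such a scheme builds on.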

Author Information

Daria Reshetova (Stanford University)
Wei-Ning Chen (Stanford University)

Wei-Ning Chen is currently a Ph.D. student at Stanford EE, supported by a Stanford Graduate Fellowship (SGF). His research interests broadly lie in information-theoretic and algorithmic aspects of data science. He adopts tools mainly from information theory, theoretical machine learning, and statistical inference, with a current focus on distributed inference, federated learning, and differential privacy.

Ayfer Ozgur (Stanford University)
