

Poster in Workshop: Theory and Practice of Differential Privacy

Improved Rates for Differentially Private Stochastic Convex Optimization with Heavy-Tailed Data

Gautam Kamath · Xingtu Liu · Huanyu Zhang


Abstract: We study stochastic convex optimization with heavy-tailed data under the constraint of differential privacy. Most prior work on this problem is restricted to the case where the loss function is Lipschitz. Instead, as introduced by Wang, Xiao, Devadas, and Xu~\cite{WangXDX20}, we study general convex loss functions with the assumption that the distribution of gradients has bounded $k$-th moments. We provide improved upper bounds on the excess population risk under approximate differential privacy of $\tilde{O}\left(\sqrt{\frac{d}{n}}+\left(\frac{d}{\epsilon n}\right)^{\frac{k-1}{k}}\right)$ and $\tilde{O}\left(\frac{d}{n}+\left(\frac{d}{\epsilon n}\right)^{\frac{2k-2}{k}}\right)$ for convex and strongly convex loss functions, respectively. We also prove nearly-matching lower bounds under the constraint of pure differential privacy, giving strong evidence that our bounds are tight.
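As a quick arithmetic check (not stated in the abstract itself), the private terms in both bounds behave as expected as the moment parameter $k$ grows, i.e. as the gradient distribution approaches the bounded (Lipschitz-like) regime:

$$
\lim_{k\to\infty}\left(\frac{d}{\epsilon n}\right)^{\frac{k-1}{k}} = \frac{d}{\epsilon n},
\qquad
\lim_{k\to\infty}\left(\frac{d}{\epsilon n}\right)^{\frac{2k-2}{k}} = \left(\frac{d}{\epsilon n}\right)^{2},
$$

so the convex and strongly convex bounds tend to $\tilde{O}\!\left(\sqrt{\frac{d}{n}}+\frac{d}{\epsilon n}\right)$ and $\tilde{O}\!\left(\frac{d}{n}+\left(\frac{d}{\epsilon n}\right)^{2}\right)$, respectively, while smaller $k$ (heavier tails) weakens the dependence on $\frac{d}{\epsilon n}$.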
