Poster in Workshop: Theory and Practice of Differential Privacy
The Sample Complexity of Distribution-Free Parity Learning in the Robust Shuffle Model
Kobbi Nissim · Chao Yan
Abstract:
We provide a lower bound on the sample complexity of distribution-free parity learning in the realizable case in the shuffle model of differential privacy. Namely, we show that the sample complexity of learning $d$-bit parity functions is $\Omega(2^{d/2})$. Our result extends a similar recent lower bound on the sample complexity of private agnostic learning of parity functions in the shuffle model due to Cheu and Ullman. We also sketch a simple shuffle model protocol demonstrating that our results are tight up to $\mbox{poly}(d)$ factors.
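For readers unfamiliar with the learning task, the sketch below illustrates realizable parity learning in the non-private setting: a hidden parity $s \in \{0,1\}^d$ labels each example $x$ with $\langle x, s\rangle \bmod 2$, and $s$ can be recovered by Gaussian elimination over GF(2). This is only an illustration of the task whose private sample complexity the paper bounds; it is not the authors' shuffle model protocol, and the function names and parameters are hypothetical.

```python
# Illustrative (non-private) parity learning: recover a hidden parity
# s in {0,1}^d from labeled examples (x, <x, s> mod 2) by Gaussian
# elimination over GF(2). Not the paper's shuffle-model protocol.
import random


def sample_examples(s, n, rng):
    """Draw n uniform examples labeled by the parity function x -> <x, s> mod 2."""
    d = len(s)
    data = []
    for _ in range(n):
        x = [rng.randint(0, 1) for _ in range(d)]
        y = sum(xi & si for xi, si in zip(x, s)) % 2
        data.append((x, y))
    return data


def solve_parity(examples, d):
    """Gaussian elimination over GF(2): return some s consistent with the examples."""
    rows = [x[:] + [y] for x, y in examples]  # augmented matrix over GF(2)
    pivots = []  # (row index, pivot column) pairs
    r = 0
    for c in range(d):
        pivot = next((i for i in range(r, len(rows)) if rows[i][c] == 1), None)
        if pivot is None:
            continue  # no pivot in this column; the coordinate stays free
        rows[r], rows[pivot] = rows[pivot], rows[r]
        for i in range(len(rows)):
            if i != r and rows[i][c] == 1:
                rows[i] = [a ^ b for a, b in zip(rows[i], rows[r])]
        pivots.append((r, c))
        r += 1
    s = [0] * d  # free coordinates default to 0
    for i, c in pivots:
        s[c] = rows[i][d]
    return s


rng = random.Random(0)
d = 8
secret = [rng.randint(0, 1) for _ in range(d)]
train = sample_examples(secret, 4 * d, rng)
# With ~4d uniform examples the system is full rank with high probability,
# so the recovered parity matches the hidden one.
print(solve_parity(train, d) == secret)
```

Non-privately, $O(d)$ examples suffice as above; the paper's point is that robust shuffle model privacy forces an exponentially larger sample, $\Omega(2^{d/2})$.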