Poster
Private Truly-Everlasting Robust-Prediction
Uri Stemmer
Hall C 4-9 #2304
Oral presentation: Oral 2C Privacy
Tue 23 Jul 7:30 a.m. PDT — 8:30 a.m. PDT
Tue 23 Jul 4:30 a.m. PDT — 6 a.m. PDT
Abstract:
Private everlasting prediction (PEP), recently introduced by Naor et al. [2023], is a model for differentially private learning in which the learner never publicly releases a hypothesis. Instead, it provides black-box access to a "prediction oracle" that can predict the labels of an *endless stream* of unlabeled examples drawn from the underlying distribution. Importantly, PEP provides privacy both for the initial training set and for the endless stream of classification queries. We present two conceptual modifications to the definition of PEP, as well as new constructions exhibiting significant improvements over prior work. Specifically, we incorporate robustness against poisoning attacks into the definition of PEP; we present a relaxed privacy definition, suitable for PEP, that allows us to disconnect the privacy parameter $\delta$ from the number of total time steps $T$; and we present new constructions for axis-aligned rectangles and decision stumps exhibiting improved sample complexity and runtime.
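To make the PEP interface concrete, here is a minimal Python sketch of the access model the abstract describes: a learner trained once on a labeled sample that then answers an endless stream of label queries without ever releasing its hypothesis. The `PredictionOracle` class, the 1-D threshold learner inside it, and the noise it adds are all illustrative assumptions, not the paper's construction or an actual differentially private mechanism.

```python
import random

class PredictionOracle:
    """Hypothetical sketch of the PEP access model: callers get black-box
    predictions only; the internal hypothesis is never released."""

    def __init__(self, labeled_sample, seed=None):
        # Stand-in "learner": a one-dimensional threshold classifier.
        # The random perturbation below is a placeholder gesturing at a
        # private learner; it is NOT a real DP mechanism.
        rng = random.Random(seed)
        positives = [x for x, y in labeled_sample if y == 1]
        threshold = min(positives) if positives else float("inf")
        self._threshold = threshold + (rng.random() - 0.5)  # kept internal

    def predict(self, x):
        # Black-box query on the endless stream of unlabeled examples:
        # the caller observes a label, never the threshold itself.
        return 1 if x >= self._threshold else 0
```

In the PEP model, privacy must hold jointly over the training set and the entire (unbounded) sequence of `predict` calls, which is what distinguishes it from one-shot private learning.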