

Poster in Workshop: Theory and Practice of Differential Privacy

Differentially Private Classification via 0-1 Loss

Ryan McKenna


Abstract: Classification is one of the most important tasks involving data. Traditional approaches to classification use the framework of empirical risk minimization with a convex surrogate loss. Due to the importance of this problem, differentially private empirical risk minimization has been the subject of intense research dating back to the influential work of Chaudhuri et al. In this short paper, we propose an alternative approach to differentially private classification based on the $0$-$1$ loss function. In theory, this method offers a very appealing alternative under differential privacy requirements. We argue that differentially private classification is no harder than the canonical "most common medical condition" problem, which is easily solved by the exponential mechanism. In practice, this approach works remarkably well, although there are challenges to implementing it exactly for high-dimensional data.
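
To make the connection to the exponential mechanism concrete, here is a minimal sketch (not the paper's implementation) of selecting a classifier with low empirical $0$-$1$ loss from a finite candidate set. The candidate set, the NumPy-based interface, and the function name `exponential_mechanism_classifier` are assumptions introduced for illustration; the utility is the number of correctly classified records, which has sensitivity 1 under add/remove of a single record.

```python
import numpy as np


def exponential_mechanism_classifier(candidates, X, y, epsilon, rng=None):
    """Pick a classifier from `candidates` via the exponential mechanism.

    Illustrative sketch only. Utility of a candidate h is the number of
    records it classifies correctly (n minus its 0-1 loss); adding or
    removing one record changes this count by at most 1, so sensitivity = 1.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Utility: count of correct predictions for each candidate classifier.
    utilities = np.array([np.sum(h(X) == y) for h in candidates], dtype=float)
    # Exponential-mechanism scores: epsilon * u / (2 * sensitivity).
    scores = 0.5 * epsilon * utilities
    scores -= scores.max()  # subtract max for numerical stability
    probs = np.exp(scores)
    probs /= probs.sum()
    return candidates[rng.choice(len(candidates), p=probs)]


# Hypothetical usage with a tiny set of threshold classifiers on 1-D data.
X = np.array([0.1, 0.4, 0.6, 0.9])
y = np.array([0, 0, 1, 1])
candidates = [lambda x, t=t: (x > t).astype(int) for t in (0.25, 0.5, 0.75)]
h_priv = exponential_mechanism_classifier(candidates, X, y, epsilon=1.0)
```

Note that this sketch assumes a finite candidate set; as the abstract indicates, implementing the selection exactly over richer, high-dimensional classifier families is where the practical challenges arise.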
