
Differentially Private Classification via 0-1 Loss
Ryan McKenna
Classification is one of the most important tasks involving data. Traditional approaches to classification use the framework of empirical risk minimization on a convex surrogate loss. Due to the importance of this problem, differentially private empirical risk minimization has been the subject of intense research effort dating back to the influential work of Chaudhuri et al. In this short paper, we propose an alternate approach to differentially private classification, based on the 0-1 loss function. In theory, this method offers a very appealing alternative under differential privacy requirements. We argue that differentially private classification is no harder than the canonical "most common medical condition" problem, which is easily solved by the exponential mechanism. In practice, this approach works remarkably well, although there are challenges to implementing it exactly for high-dimensional data.
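The abstract's reduction can be illustrated concretely: selecting the classifier with the fewest 0-1 errors from a finite candidate set is a selection problem (like "most common medical condition"), and the 0-1 error count has sensitivity 1, so the exponential mechanism applies directly. The sketch below is a minimal illustration of that idea under these assumptions, not the paper's exact construction; the function name and candidate-set setup are hypothetical.

```python
import numpy as np

def exponential_mechanism_classifier(classifiers, X, y, epsilon, rng=None):
    """Privately select one classifier from a finite candidate set.

    Score of a candidate = -(number of 0-1 errors on the data).
    Adding or removing one record changes any score by at most 1,
    so the sensitivity is 1 and the standard exponential mechanism
    gives epsilon-differential privacy for the selection.
    Illustrative sketch only -- assumed setup, not the paper's method.
    """
    rng = np.random.default_rng() if rng is None else rng
    scores = np.array([-np.sum(clf(X) != y) for clf in classifiers],
                      dtype=float)
    # Exponential mechanism: P(select i) proportional to
    # exp(epsilon * score_i / (2 * sensitivity)), sensitivity = 1.
    logits = epsilon * scores / 2.0
    logits -= logits.max()            # numerical stability
    probs = np.exp(logits)
    probs /= probs.sum()
    idx = rng.choice(len(classifiers), p=probs)
    return classifiers[idx]
```

With a large privacy budget the mechanism concentrates on the empirical-risk minimizer; as epsilon shrinks, the selection approaches uniform over the candidates, which is the usual privacy/utility trade-off for this mechanism.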

Author Information

Ryan McKenna (UMass Amherst)
