Poster
Oracle Efficient Private Non-Convex Optimization
Seth Neel · Aaron Roth · Giuseppe Vietri · Steven Wu

Tue Jul 14 08:00 AM -- 08:45 AM & Tue Jul 14 09:00 PM -- 09:45 PM (PDT) @ Virtual

One of the most effective algorithms for differentially private learning and optimization is objective perturbation. This technique augments a given optimization problem (e.g., one derived from an ERM problem) with a random linear term, and then solves it exactly. However, to date, analyses of this approach crucially rely on the convexity and smoothness of the objective function. We give two algorithms that extend this approach substantially. The first algorithm requires nothing except boundedness of the loss function, and operates over a discrete domain. Its privacy and accuracy guarantees hold even without assuming convexity. We are able to extend traditional analyses of objective perturbation by introducing a novel normalization step into the algorithm, which provides enough stability to be differentially private even without second-order conditions. The second algorithm operates over a continuous domain and requires only that the loss function be bounded and Lipschitz in its continuous parameter. Its privacy analysis does not even require convexity. Its accuracy analysis does require convexity, but does not require second-order conditions like smoothness. We complement our theoretical results with an empirical evaluation of the non-convex case, in which we use an integer program solver as our optimization oracle. We find that for the problem of learning linear classifiers, directly optimizing for 0/1 loss using our approach can outperform the more standard approach of privately optimizing a convex surrogate loss function on the Adult dataset.
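To make the core idea concrete, here is a minimal sketch of objective perturbation over a discrete domain: the empirical loss is augmented with a random linear term, and the perturbed objective is then minimized exactly by an oracle (here, brute force over the candidate set). This is only an illustration of the perturb-then-optimize template, not the paper's algorithm: the normalization step and the privacy-calibrated noise scale are omitted, and all names, the toy data, and the noise scale of 0.1 are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: binary labels roughly separable by a linear threshold.
X = rng.normal(size=(200, 2))
y = (X @ np.array([1.0, -0.5]) > 0).astype(int)

# Discrete domain of candidate linear classifiers: a grid of unit vectors.
angles = np.linspace(0, 2 * np.pi, 64, endpoint=False)
domain = np.stack([np.cos(angles), np.sin(angles)], axis=1)

def zero_one_loss(theta):
    """Average 0/1 classification error of the linear classifier theta."""
    preds = (X @ theta > 0).astype(int)
    return np.mean(preds != y)

# Objective perturbation: add a random linear term <noise, theta> to the
# loss, then solve the perturbed problem exactly over the discrete domain.
# (In the private algorithm the noise scale is set by the privacy analysis.)
noise = rng.normal(scale=0.1, size=2)
perturbed = [zero_one_loss(th) + noise @ th for th in domain]
theta_hat = domain[int(np.argmin(perturbed))]
print(zero_one_loss(theta_hat))
```

In the paper's non-convex experiments, the brute-force argmin above is replaced by a call to an integer program solver acting as the optimization oracle for 0/1 loss.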

Author Information

Seth Neel (University of Pennsylvania)
Aaron Roth (University of Pennsylvania)
Giuseppe Vietri (University of Minnesota)
Steven Wu (University of Minnesota)
