Differentially Private Submodular Maximization with a Knapsack Constraint
Ron Zadicario ⋅ Tova Milo
Abstract
Submodular maximization subject to a knapsack constraint (SMK) is a fundamental problem in discrete optimization, with wide-ranging applications in machine learning and related fields. As these applications increasingly involve sensitive individual data, there is a growing need for high-utility algorithms that provide formal privacy guarantees. In this work, we study the SMK problem under differential privacy, considering both monotone and non-monotone objective functions. For monotone objectives, we propose a differentially private algorithm that achieves the optimal $(1-1/e)$-approximation ratio while significantly improving both additive error and query complexity over prior work. We also present a more efficient variant attaining a $1/2$-approximation. For non-monotone objectives, we introduce, to our knowledge, the first differentially private algorithm with provable guarantees, achieving a $1/4$-approximation in expectation and an additive error comparable to the best known for monotone objectives.
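As background for the SMK problem studied above, the following is a minimal sketch of the classical (non-private) cost-benefit greedy baseline: repeatedly pick the item with the best marginal-gain-to-cost ratio that still fits the budget. This is not the differentially private algorithm of the paper; the coverage objective, the sets, the costs, and the budget below are hypothetical examples chosen for illustration.

```python
# Illustrative (non-private) cost-benefit greedy for submodular maximization
# under a knapsack constraint (SMK). The coverage objective and the instance
# are hypothetical; this is a classical baseline, not the paper's algorithm.

def coverage(selected, sets):
    """Monotone submodular objective: number of ground elements covered."""
    covered = set()
    for i in selected:
        covered |= sets[i]
    return len(covered)

def greedy_knapsack(sets, costs, budget):
    """Greedily add the feasible item with the best marginal gain per unit cost."""
    chosen, spent = [], 0.0
    remaining = set(range(len(sets)))
    while True:
        base = coverage(chosen, sets)
        best, best_ratio = None, 0.0
        for i in remaining:
            if spent + costs[i] > budget:
                continue  # item does not fit in the remaining budget
            gain = coverage(chosen + [i], sets) - base
            if costs[i] > 0 and gain / costs[i] > best_ratio:
                best, best_ratio = i, gain / costs[i]
        if best is None:
            return chosen  # no feasible item improves the objective
        chosen.append(best)
        spent += costs[best]
        remaining.remove(best)

# Hypothetical instance: 4 candidate sets with costs, budget of 4.
sets = [{1, 2, 3}, {3, 4}, {5}, {1, 4, 5, 6}]
costs = [2.0, 1.0, 1.0, 3.0]
picked = greedy_knapsack(sets, costs, budget=4.0)
```

On its own this ratio rule can be far from optimal; the standard fix combines it with the best single feasible item, and the private algorithms in the paper must additionally add calibrated noise to each marginal-gain comparison, which is the source of the additive error terms in the guarantees above.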