In this paper we analyze the robustness of stochastic variants of the greedy algorithm for submodular maximization. Our main result shows that for maximizing a monotone submodular function under a cardinality constraint, iteratively selecting an element whose marginal contribution is approximately maximal in expectation is a sufficient condition to obtain the optimal approximation guarantee with exponentially high probability, assuming the cardinality is sufficiently large. One consequence of our result is that the linear-time STOCHASTIC-GREEDY algorithm recently proposed by Mirzasoleiman et al. (2015) achieves the optimal running time while maintaining an optimal approximation guarantee. We also show that high-probability guarantees cannot be obtained for stochastic greedy algorithms under matroid constraints, and prove an approximation guarantee which holds in expectation. In contrast to the guarantees of the greedy algorithm, we show that the approximation ratio of stochastic local search is arbitrarily bad, both with high probability and in expectation.
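For context, a minimal sketch of the STOCHASTIC-GREEDY procedure referenced in the abstract (following Mirzasoleiman et al., 2015): in each of the k iterations, a uniformly random sample of roughly (n/k)·log(1/ε) elements is drawn from the unselected ground set, and the sampled element with the largest marginal contribution is added. The function and variable names below (stochastic_greedy, f, ground_set, eps) are illustrative choices, not from the paper.

```python
import math
import random

def stochastic_greedy(f, ground_set, k, eps=0.1, seed=None):
    """Sketch of STOCHASTIC-GREEDY (Mirzasoleiman et al., 2015).

    f: set function; f(S) returns the value of set S (assumed monotone submodular).
    ground_set: iterable of elements.
    k: cardinality constraint.
    eps: accuracy parameter; per-iteration sample size is ~ (n/k) * log(1/eps).
    """
    rng = random.Random(seed)
    elements = list(ground_set)
    n = len(elements)
    sample_size = max(1, int(math.ceil((n / k) * math.log(1.0 / eps))))

    selected = set()
    for _ in range(k):
        remaining = [e for e in elements if e not in selected]
        if not remaining:
            break
        # Draw a uniformly random sample from the unselected elements.
        sample = rng.sample(remaining, min(sample_size, len(remaining)))
        # Add the sampled element with the largest marginal contribution.
        base = f(selected)
        best = max(sample, key=lambda e: f(selected | {e}) - base)
        selected.add(best)
    return selected
```

A small usage example on maximum coverage, a standard monotone submodular objective (synthetic data, for illustration only):

```python
sets = {i: {random.randrange(50) for _ in range(5)} for i in range(100)}
coverage = lambda S: len(set().union(*(sets[i] for i in S))) if S else 0
print(stochastic_greedy(coverage, sets.keys(), k=10))
```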
Author Information
Yaron Singer (Harvard)
Avinatan Hassidim (Bar Ilan University)
Related Events (a corresponding poster, oral, or spotlight)
- 2017 Talk: Robust Guarantees of Stochastic Greedy Algorithms (Wed. Aug 9th, 03:30 -- 03:48 AM, Room Parkside 2)
More from the Same Authors
- 2021 Poster: Instance Specific Approximations for Submodular Maximization (Eric Balkanski · Sharon Qian · Yaron Singer)
- 2021 Spotlight: Instance Specific Approximations for Submodular Maximization (Eric Balkanski · Sharon Qian · Yaron Singer)
- 2020: Exponentially Faster Algorithms for Machine Learning (Yaron Singer)
- 2020 Poster: Predicting Choice with Set-Dependent Aggregation (Nir Rosenfeld · Kojin Oshiba · Yaron Singer)
- 2020 Poster: The FAST Algorithm for Submodular Maximization (Adam Breuer · Eric Balkanski · Yaron Singer)
- 2019 Poster: Robust Influence Maximization for Hyperparametric Models (Dimitrios Kalimeris · Gal Kaplun · Yaron Singer)
- 2019 Oral: Robust Influence Maximization for Hyperparametric Models (Dimitrios Kalimeris · Gal Kaplun · Yaron Singer)
- 2018 Poster: Approximation Guarantees for Adaptive Sampling (Eric Balkanski · Yaron Singer)
- 2018 Oral: Approximation Guarantees for Adaptive Sampling (Eric Balkanski · Yaron Singer)
- 2018 Poster: Learning Diffusion using Hyperparameters (Dimitrios Kalimeris · Yaron Singer · Karthik Subbian · Udi Weinsberg)
- 2018 Poster: Learning to Optimize Combinatorial Functions (Nir Rosenfeld · Eric Balkanski · Amir Globerson · Yaron Singer)
- 2018 Oral: Learning to Optimize Combinatorial Functions (Nir Rosenfeld · Eric Balkanski · Amir Globerson · Yaron Singer)
- 2018 Oral: Learning Diffusion using Hyperparameters (Dimitrios Kalimeris · Yaron Singer · Karthik Subbian · Udi Weinsberg)