In this paper we analyze an adaptive sampling approach for submodular maximization. Adaptive sampling is a technique that has recently been shown to achieve a constant factor approximation guarantee for submodular maximization under a cardinality constraint with exponentially fewer adaptive rounds than any previously studied constant factor approximation algorithm for this problem. Adaptivity quantifies the number of sequential rounds that an algorithm makes when function evaluations can be executed in parallel, and is the parallel running time of an algorithm, up to low-order terms. Adaptive sampling achieves its exponential speedup at the expense of approximation. In theory, it is guaranteed to produce a solution that is a 1/3 approximation to the optimum. Nevertheless, experiments show that adaptive sampling techniques achieve far better values in practice. In this paper we provide theoretical justification for this phenomenon. In particular, we show that under very mild conditions on the curvature of a function, adaptive sampling techniques achieve an approximation arbitrarily close to 1/2 while maintaining their low adaptivity. Furthermore, we show that the approximation ratio approaches 1 in direct relationship to a homogeneity property of the submodular function. In addition, we conduct experiments on real data sets in which the curvature and homogeneity properties can be easily manipulated, and demonstrate the relationship between approximation and curvature, as well as the effectiveness of adaptive sampling in practice.
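To make the round structure concrete, below is a minimal Python sketch of one way an adaptive-sampling round can be organized for maximizing a monotone submodular function under a cardinality constraint. It is an illustrative simplification, not the paper's exact algorithm or its guarantees; the function adaptive_sampling and the parameters block, num_samples, and eps are assumptions made for this example. The point it shows is that every function evaluation inside a round is independent of the others, so each round is fully parallelizable and only the number of rounds is sequential.

    import random

    def adaptive_sampling(f, ground_set, k, block=None, num_samples=30, eps=0.1):
        """Minimal sketch of an adaptive-sampling round structure for
        maximizing a monotone submodular function f under a cardinality
        constraint k. An illustrative simplification, not the paper's
        algorithm; block, num_samples, and eps are assumed parameters."""
        S = set()                        # current solution
        X = set(ground_set)              # surviving candidate elements
        block = block or max(1, k // 10)
        while len(S) < k and X:
            b = min(block, k - len(S), len(X))
            # One adaptive round: estimate, for each survivor x, its expected
            # marginal value when a random block R of survivors is added to S.
            # All |X| * num_samples evaluations below are independent of one
            # another, so the whole round could be executed in parallel.
            est = {}
            for x in X:
                total = 0.0
                for _ in range(num_samples):
                    pool = list(X - {x})
                    R = set(random.sample(pool, min(b, len(pool))))
                    total += f(S | R | {x}) - f(S | R)
                est[x] = total / num_samples
            # Filter out elements whose estimate falls well below the best,
            # then commit a random block of survivors to the solution.
            cutoff = (1 - eps) * max(est.values())
            X = {x for x in X if est[x] >= cutoff}
            S |= set(random.sample(list(X), min(b, len(X))))
            X -= S
        return S

    # Example: a coverage function (monotone submodular) on random sets.
    random.seed(0)
    cover_sets = {i: set(random.sample(range(100), 12)) for i in range(50)}
    f = lambda S: len(set().union(*(cover_sets[i] for i in S)))
    print(len(adaptive_sampling(f, cover_sets.keys(), k=10)))

In this sketch, each pass through the while loop is one adaptive round; the number of such rounds, rather than the total number of function evaluations, is what the paper's adaptivity measure counts.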
Author Information
Eric Balkanski (Harvard)
Yaron Singer (Harvard)
Related Events (a corresponding poster, oral, or spotlight)
- 2018 Poster: Approximation Guarantees for Adaptive Sampling
  Thu. Jul 12th 04:15 -- 07:00 PM, Room Hall B #117
More from the Same Authors
- 2021 Poster: Instance Specific Approximations for Submodular Maximization
  Eric Balkanski · Sharon Qian · Yaron Singer
- 2021 Spotlight: Instance Specific Approximations for Submodular Maximization
  Eric Balkanski · Sharon Qian · Yaron Singer
- 2020: Exponentially Faster Algorithms for Machine Learning
  Yaron Singer
- 2020 Poster: Predicting Choice with Set-Dependent Aggregation
  Nir Rosenfeld · Kojin Oshiba · Yaron Singer
- 2020 Poster: The FAST Algorithm for Submodular Maximization
  Adam Breuer · Eric Balkanski · Yaron Singer
- 2019 Poster: Robust Influence Maximization for Hyperparametric Models
  Dimitrios Kalimeris · Gal Kaplun · Yaron Singer
- 2019 Oral: Robust Influence Maximization for Hyperparametric Models
  Dimitrios Kalimeris · Gal Kaplun · Yaron Singer
- 2018 Poster: Learning Diffusion using Hyperparameters
  Dimitrios Kalimeris · Yaron Singer · Karthik Subbian · Udi Weinsberg
- 2018 Poster: Learning to Optimize Combinatorial Functions
  Nir Rosenfeld · Eric Balkanski · Amir Globerson · Yaron Singer
- 2018 Oral: Learning to Optimize Combinatorial Functions
  Nir Rosenfeld · Eric Balkanski · Amir Globerson · Yaron Singer
- 2018 Oral: Learning Diffusion using Hyperparameters
  Dimitrios Kalimeris · Yaron Singer · Karthik Subbian · Udi Weinsberg
- 2017 Poster: Robust Guarantees of Stochastic Greedy Algorithms
  Yaron Singer · Avinatan Hassidim
- 2017 Talk: Robust Guarantees of Stochastic Greedy Algorithms
  Yaron Singer · Avinatan Hassidim