In this talk I’ll describe a novel approach that yields algorithms whose parallel running time is exponentially faster than that of any previously known algorithm for a broad range of machine learning applications. The algorithms are designed for submodular function maximization, which is the algorithmic engine behind applications such as clustering, network analysis, feature selection, Bayesian inference, ranking, speech and document summarization, recommendation systems, hyperparameter tuning, and many others. Since applications of submodular functions are ubiquitous across machine learning and data sets continue to grow, there is consistent demand for accelerating submodular optimization. The approach we describe yields simple algorithms whose parallel runtime is logarithmic in the size of the data rather than linear. I’ll introduce the frameworks we recently developed and present experimental results from various application domains.
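For context, a minimal sketch (not the talk's algorithm) of the classic greedy method for cardinality-constrained submodular maximization, illustrated on a maximum-coverage objective. Its k selection rounds are inherently sequential, since each pick depends on all previous picks; this linear adaptivity is the bottleneck that the parallel approach described in the talk avoids. The function names and the toy data below are illustrative assumptions, not from the talk.

```python
def greedy_max(ground_set, f, k):
    """Pick k elements, each round adding the element with the
    largest marginal gain f(S + {e}) - f(S)."""
    S = set()
    for _ in range(k):  # k sequential (adaptive) rounds
        best = max((e for e in ground_set if e not in S),
                   key=lambda e: f(S | {e}) - f(S))
        S.add(best)
    return S

# Example: maximum coverage, a canonical submodular objective.
sets = {
    "a": {1, 2, 3},
    "b": {3, 4},
    "c": {4, 5, 6},
    "d": {1, 6},
}

def coverage(S):
    """Number of distinct items covered by the chosen sets."""
    return len(set().union(*(sets[e] for e in S))) if S else 0

print(sorted(greedy_max(sets.keys(), coverage, 2)))  # → ['a', 'c']
```

With k picks over n elements, greedy performs k adaptive rounds; the talk's framework reduces the number of such rounds to logarithmic in n by evaluating many candidates in parallel per round.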
Author Information
Yaron Singer (Harvard)
More from the Same Authors
- 2021 Poster: Instance Specific Approximations for Submodular Maximization »
  Eric Balkanski · Sharon Qian · Yaron Singer
- 2021 Spotlight: Instance Specific Approximations for Submodular Maximization »
  Eric Balkanski · Sharon Qian · Yaron Singer
- 2020 Poster: Predicting Choice with Set-Dependent Aggregation »
  Nir Rosenfeld · Kojin Oshiba · Yaron Singer
- 2020 Poster: The FAST Algorithm for Submodular Maximization »
  Adam Breuer · Eric Balkanski · Yaron Singer
- 2019 Poster: Robust Influence Maximization for Hyperparametric Models »
  Dimitrios Kalimeris · Gal Kaplun · Yaron Singer
- 2019 Oral: Robust Influence Maximization for Hyperparametric Models »
  Dimitrios Kalimeris · Gal Kaplun · Yaron Singer
- 2018 Poster: Approximation Guarantees for Adaptive Sampling »
  Eric Balkanski · Yaron Singer
- 2018 Oral: Approximation Guarantees for Adaptive Sampling »
  Eric Balkanski · Yaron Singer
- 2018 Poster: Learning Diffusion using Hyperparameters »
  Dimitrios Kalimeris · Yaron Singer · Karthik Subbian · Udi Weinsberg
- 2018 Poster: Learning to Optimize Combinatorial Functions »
  Nir Rosenfeld · Eric Balkanski · Amir Globerson · Yaron Singer
- 2018 Oral: Learning to Optimize Combinatorial Functions »
  Nir Rosenfeld · Eric Balkanski · Amir Globerson · Yaron Singer
- 2018 Oral: Learning Diffusion using Hyperparameters »
  Dimitrios Kalimeris · Yaron Singer · Karthik Subbian · Udi Weinsberg
- 2017 Poster: Robust Guarantees of Stochastic Greedy Algorithms »
  Yaron Singer · Avinatan Hassidim
- 2017 Talk: Robust Guarantees of Stochastic Greedy Algorithms »
  Yaron Singer · Avinatan Hassidim