Oral
Multivariate Submodular Optimization
Richard Santiago · F. Bruce Shepherd
Submodular functions have found a wealth of new applications in data science and machine learning in recent years. This has been coupled with many algorithmic advances in the area of submodular optimization (SO): $\min/\max~f(S): S \in \mathcal{F}$, where $\mathcal{F}$ is a given family of feasible sets over a ground set $V$ and $f:2^V \rightarrow \mathbb{R}$ is submodular. In this work we focus on a more general class of \emph{multivariate submodular optimization} (MVSO) problems: $\min/\max~f(S_1,S_2,\ldots,S_k): S_1 \uplus S_2 \uplus \cdots \uplus S_k \in \mathcal{F}$. Here we use $\uplus$ to denote disjoint union, so this model is attractive in settings where resources are allocated across $k$ agents who share a ``joint'' multivariate nonnegative objective $f(S_1,S_2,\ldots,S_k)$ that captures some type of submodularity (i.e., diminishing-returns) property. We provide explicit examples and potential applications for this new framework. For maximization, we show that practical algorithms such as accelerated greedy variants and distributed algorithms achieve good approximation guarantees for very general families (such as matroids and $p$-systems). For arbitrary families, we show that monotone (resp. nonmonotone) MVSO admits an $\alpha(1-1/e)$ (resp. $\alpha \cdot 0.385$) approximation whenever monotone (resp. nonmonotone) SO admits an $\alpha$-approximation over the multilinear formulation. This substantially expands the family of tractable models. On the minimization side we give essentially optimal approximations in terms of the curvature of $f$.
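To make the MVSO model concrete, here is a minimal greedy sketch (not from the paper) for the monotone maximization case under a simple cardinality constraint. The function `greedy_mvso`, the toy objective `f`, and the element `weights` are all illustrative assumptions: each agent's value is the square root of the total weight it receives, whose concavity gives the diminishing-returns behavior; the parts $S_1,\ldots,S_k$ stay disjoint because each element is assigned to at most one agent.

```python
import math
from itertools import product

def greedy_mvso(V, k, f, budget):
    """Greedy sketch for monotone MVSO under a cardinality constraint.

    At each step, assign the (element, agent) pair with the largest
    marginal gain; parts stay disjoint since each element is used once.
    """
    parts = [set() for _ in range(k)]
    used = set()
    for _ in range(budget):
        best, best_gain = None, 0.0
        base = f(parts)
        for e, i in product(V, range(k)):
            if e in used:
                continue
            trial = [p | {e} if j == i else p for j, p in enumerate(parts)]
            gain = f(trial) - base
            if gain > best_gain:
                best_gain, best = gain, (e, i)
        if best is None:
            break  # no remaining pair gives positive marginal gain
        e, i = best
        parts[i].add(e)
        used.add(e)
    return parts

# Hypothetical instance: three elements, two agents.
weights = {"a": 4.0, "b": 1.0, "c": 1.0}

def f(parts):
    # Sum over agents of sqrt(total weight assigned to that agent).
    return sum(math.sqrt(sum(weights[e] for e in p)) for p in parts)

parts = greedy_mvso(list(weights), k=2, f=f, budget=3)
```

Note that the greedy rule here ranges over (element, agent) pairs rather than just elements, which is exactly what distinguishes the multivariate setting from ordinary SO.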
Author Information
Richard Santiago (McGill University)
F. Bruce Shepherd (University of British Columbia)
Related Events (a corresponding poster, oral, or spotlight)

2019 Poster: Multivariate Submodular Optimization
Thu Jun 13th, 01:30 AM – 04:00 AM, Room: Pacific Ballroom