

Contributed talk in Workshop: Negative Dependence and Submodularity: Theory and Applications in Machine Learning

Constrained Maximization of Lattice Submodular Functions

Aytunc Sahin · Joachim Buhmann · Andreas Krause


Abstract:

Submodular optimization over the integer lattice has many applications in machine learning. Although the constrained maximization of submodular functions with coordinate-wise concavity (also called DR-submodular functions) is well studied, the maximization of general lattice submodular functions is considerably more challenging. In this work, we first show that we can optimize lattice submodular functions subject to a discrete (integer) polymatroid constraint using a recently proposed extension, called the Generalized Multilinear Extension. Then, we establish a bound on the rounding error for the discrete polymatroid constraint, which depends on the "distance" between the lattice submodular function and a DR-submodular function. Lastly, we demonstrate the effectiveness of our algorithm on a Bayesian experimental design problem with repetition and a concave cost.
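For readers unfamiliar with the distinction the abstract relies on, the following is a minimal sketch of the standard definitions of lattice submodularity and the DR (diminishing returns) property on the integer lattice; these definitions are not quoted from the talk itself but are the commonly used ones in this literature.

% Standard definitions for f : Z_{>=0}^n -> R (not taken from the abstract).
% Here x \vee y and x \wedge y denote coordinate-wise maximum and minimum,
% and e_i is the i-th unit vector.
\[
  f(x \vee y) + f(x \wedge y) \le f(x) + f(y)
  \qquad \text{for all } x, y
  \qquad \text{(lattice submodular)}
\]
\[
  f(x + e_i) - f(x) \ge f(y + e_i) - f(y)
  \qquad \text{for all } x \le y \text{ and all coordinates } i
  \qquad \text{(DR-submodular)}
\]

Every DR-submodular function is lattice submodular, but the converse does not hold in general, which is why the abstract treats the maximization of general lattice submodular functions as the harder problem.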
