
Graph Cuts Always Find a Global Optimum for Potts Models (With a Catch)
Hunter Lang · David Sontag · Aravindan Vijayaraghavan

Thu Jul 22 07:00 PM -- 07:20 PM (PDT)

We prove that the alpha-expansion algorithm for MAP inference always returns a globally optimal assignment for Markov Random Fields with Potts pairwise potentials, with a catch: the returned assignment is only guaranteed to be optimal for an instance within a small perturbation of the original problem instance. In other words, every local minimum with respect to expansion moves is a global minimum of a slightly perturbed version of the problem. On "real-world" instances, MAP assignments of small perturbations of the problem should be very similar to the MAP assignment(s) of the original problem instance. We design an algorithm that can certify whether this is the case in practice. On several MAP inference problem instances from computer vision, this algorithm certifies that the MAP solutions to all such perturbations are very close to solutions of the original instance. Taken together, these results give a cohesive explanation for the good performance of "graph cuts" algorithms in practice: every local expansion minimum is a global minimum of a small perturbation of the problem, and all of these global minima are close to the original solution.
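To make the objects in the abstract concrete, here is a minimal sketch of a Potts-model energy and the alpha-expansion local search it analyzes. This is an illustrative toy, not the authors' code: each expansion move (every node either keeps its current label or switches to the candidate label alpha) is found by brute-force enumeration on a tiny instance, whereas real graph-cuts implementations solve the move exactly with a min-cut. The instance below (a 3-node chain with 3 labels) is invented for illustration.

```python
import itertools

def potts_energy(labels, unary, edges):
    """Potts energy: sum of unary costs, plus weight w for every edge
    whose endpoints disagree (the Potts pairwise potential)."""
    e = sum(unary[i][labels[i]] for i in range(len(labels)))
    e += sum(w for i, j, w in edges if labels[i] != labels[j])
    return e

def alpha_expansion(unary, edges, num_labels, init=None):
    """Alpha-expansion local search. Repeatedly, for each label alpha,
    find the best 'expansion move' -- an assignment where each node either
    keeps its label or switches to alpha -- and accept it if it lowers the
    energy. Here the move is brute-forced over all 2^n node subsets,
    which is only feasible for toy instances; graph-cuts solvers compute
    the optimal move with a single min-cut."""
    n = len(unary)
    labels = list(init) if init is not None else [0] * n
    improved = True
    while improved:
        improved = False
        for alpha in range(num_labels):
            cur_e = potts_energy(labels, unary, edges)
            best, best_e = labels, cur_e
            for mask in itertools.product([False, True], repeat=n):
                cand = [alpha if m else labels[i] for i, m in enumerate(mask)]
                e = potts_energy(cand, unary, edges)
                if e < best_e:
                    best, best_e = cand, e
            if best_e < cur_e:
                labels, improved = best, True
    # On exit, `labels` is a local minimum w.r.t. expansion moves --
    # exactly the kind of assignment the paper's result characterizes.
    return labels

# Toy instance: 3-node chain, 3 labels; unary costs strongly prefer
# label i at node i, and each chain edge carries Potts weight 1.
unary = [[0, 5, 5], [5, 0, 5], [5, 5, 0]]
edges = [(0, 1, 1), (1, 2, 1)]
result = alpha_expansion(unary, edges, num_labels=3)
```

On this instance the local search reaches the assignment `[0, 1, 2]` (energy 2), which is also the global optimum; the paper's contribution is explaining why, on realistic instances, such expansion-local minima are (near-)globally optimal.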

Author Information

Hunter Lang (MIT)
David Sontag (Massachusetts Institute of Technology)
Aravindan Vijayaraghavan (Northwestern University)
