

Poster

Convergence of Some Convex Message Passing Algorithms to a Fixed Point

Václav Voráček · Tomáš Werner


Abstract: A popular approach to the MAP inference problem in graphical models is to minimize an upper bound obtained from a dual linear programming or Lagrangian relaxation by (block-)coordinate descent. Examples of such algorithms are max-sum diffusion and sequential tree-reweighted message passing. The convergence properties of these methods are currently not fully understood. They have been proved to converge to the set characterized by local consistency of active constraints, with unknown convergence rate; however, it was not clear whether the iterates converge at all (to any single point). We prove a stronger result (which was conjectured before but never proved): the iterates converge to a fixed point of the algorithm. Moreover, we show that they achieve precision $\varepsilon>0$ in $\mathcal{O}(1/\varepsilon)$ iterations. We first prove this for a version of coordinate descent applied to a general piecewise-affine convex objective, using a novel proof technique. We then demonstrate the generality of this approach by reducing some popular coordinate-descent algorithms to this problem. Finally, we show that, in contrast to our main result, a similar version of coordinate descent applied to a constrained optimization problem need not converge.
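For intuition, here is a minimal Python sketch (not from the paper) of the kind of method the abstract analyzes: exact cyclic coordinate descent on a piecewise-affine convex objective $f(x) = \max_i (a_i^\top x + b_i)$. All names are ours, and the update rule is a naive instantiation; the paper's "version of coordinate descent" may, e.g., break ties among coordinate-wise minimizers differently, which is relevant to its guarantees.

```python
import numpy as np

def coord_min_1d(slopes, intercepts):
    """Exactly minimize g(t) = max_i(slopes[i]*t + intercepts[i]),
    a 1-D piecewise-affine convex function.

    Assumes both a negative and a positive slope occur, so the minimum
    is attained; a minimizer then lies at an intersection of some
    decreasing piece with some increasing piece."""
    cands = [(intercepts[i] - intercepts[k]) / (slopes[k] - slopes[i])
             for i in range(len(slopes)) for k in range(len(slopes))
             if slopes[i] < 0 < slopes[k]]
    assert cands, "objective unbounded along this coordinate"
    g = lambda t: np.max(slopes * t + intercepts)
    return min(cands, key=g)  # O(m^2) candidate points; fine for a sketch

def coordinate_descent(A, b, x0, sweeps=100):
    """Cyclic exact coordinate descent on f(x) = max_i(A[i] @ x + b[i])."""
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(sweeps):
        for j in range(x.size):
            # Restricted to x[j], f is a max of affine functions of x[j]
            # with slopes A[:, j] and intercepts A @ x - A[:, j]*x[j] + b.
            x[j] = coord_min_1d(A[:, j], A @ x - A[:, j] * x[j] + b)
    return x

# Toy usage: every coordinate sees slopes of both signs.
A = np.array([[1.0, 2.0], [-1.0, 0.5], [0.5, -1.0], [-0.5, -0.5]])
b = np.array([0.0, 1.0, 2.0, 0.0])
print(coordinate_descent(A, b, x0=np.zeros(2)))
```

On nonsmooth objectives such as this, cyclic coordinate descent can stop improving at a point that is a fixed point of the updates but not a global minimizer; the abstract's result concerns exactly this behavior, showing the iterates converge to a fixed point and reach precision $\varepsilon$ in $\mathcal{O}(1/\varepsilon)$ iterations.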
