

Poster

Consistent Structured Prediction with Max-Min Margin Markov Networks

Alex Nowak · Francis Bach · Alessandro Rudi

Keywords: [ Computational Learning Theory ] [ Non-parametric Methods ] [ Structured Prediction ] [ Supervised Learning ] [ Sequential, Network, and Time-Series Modeling ]


Abstract: Max-margin methods for binary classification such as the support vector machine (SVM) have been extended to the structured prediction setting under the name of max-margin Markov networks ($M^3N$), or more generally structural SVMs. Unfortunately, these methods are statistically inconsistent when the relationship between inputs and labels is far from deterministic. We overcome such limitations by defining the learning problem in terms of a "max-min" margin formulation, naming the resulting method max-min margin Markov networks ($M^4N$). We prove consistency and finite sample generalization bounds for $M^4N$ and provide an explicit algorithm to compute the estimator. The algorithm achieves a generalization error of $O(1/\sqrt{n})$ for a total cost of $O(n)$ projection-oracle calls (which have at most the same cost as the max-oracle from $M^3N$). Experiments on multi-class classification, ordinal regression, sequence prediction and matching demonstrate the effectiveness of the proposed method.
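For context, the max-margin objects the abstract builds on are standard; in illustrative notation (a score function $f(x, y)$, a label set $\mathcal{Y}$, and a task loss $L(y, y')$, none of which are fixed by this page), the binary SVM hinge loss and the margin-rescaled structural hinge loss minimized by $M^3N$/structural SVMs can be written as:

% Illustrative background only; notation is assumed, not taken from the paper.
% Binary SVM hinge loss for a score f(x) and a label y in {-1, +1}:
\ell_{\mathrm{SVM}}(f; x, y) = \max\bigl(0,\; 1 - y\, f(x)\bigr)

% Margin-rescaled structural hinge loss of M^3N / structural SVMs,
% which augments the score comparison with the task loss L(y, y'):
\ell_{M^3N}(f; x, y) = \max_{y' \in \mathcal{Y}} \bigl[\, L(y, y') + f(x, y') - f(x, y) \,\bigr]

The $M^4N$ estimator replaces this loss-augmented maximization with the max-min margin formulation described above; its exact definition, the consistency and generalization analysis, and the projection-oracle algorithm are given in the paper itself and are not reproduced here.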
