Support Vector Machine Learning for Interdependent and Structured Output Spaces
Ioannis Tsochantaridis (Brown University), Thomas Hofmann (Brown University), Thorsten Joachims (Cornell University), Yasemin Altun (Brown University)
Learning general functional dependencies is one of the main goals in machine learning. Recent progress in kernel-based methods has focused on designing flexible and powerful input representations. This paper addresses the complementary issue of problems involving complex outputs such as multiple dependent output variables and structured output spaces. We propose to generalize multiclass Support Vector Machine learning in a formulation that involves features extracted jointly from inputs and outputs. The resulting optimization problem is solved efficiently by a cutting plane algorithm that exploits the sparseness and structural decomposition of the problem. We demonstrate the versatility and effectiveness of our method on problems ranging from supervised grammar learning and named-entity recognition, to taxonomic text classification and sequence alignment.
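As a rough illustration of the joint input-output feature idea and of the separation oracle that a cutting-plane scheme of this kind relies on, the sketch below instantiates the formulation for plain multiclass classification. All function names are illustrative rather than taken from the paper, and the working-set quadratic program is replaced by a simple perceptron-style update for brevity.

```python
import numpy as np

# Minimal sketch, assuming a multiclass instantiation of the joint feature
# map: Psi(x, y) places the input vector x into the block of w associated
# with label y. Prediction is an argmax over outputs; the separation oracle
# returns the most violated margin constraint (loss-augmented inference).
# Names below (joint_feature, most_violated_constraint) are illustrative,
# not the paper's API.

def joint_feature(x, y, n_classes):
    """Psi(x, y): stack x into the block of the weight vector for label y."""
    psi = np.zeros(len(x) * n_classes)
    psi[y * len(x):(y + 1) * len(x)] = x
    return psi

def predict(w, x, n_classes):
    """f(x) = argmax_y <w, Psi(x, y)>."""
    scores = [w @ joint_feature(x, y, n_classes) for y in range(n_classes)]
    return int(np.argmax(scores))

def most_violated_constraint(w, x, y_true, n_classes):
    """Separation oracle with 0/1 loss: argmax_y [Delta(y_true, y) + <w, Psi(x, y)>]."""
    scores = [(y != y_true) + w @ joint_feature(x, y, n_classes)
              for y in range(n_classes)]
    return int(np.argmax(scores))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_classes, dim = 3, 4
    # Toy data: each class is a Gaussian blob around its class index.
    X = np.vstack([rng.normal(loc=c, scale=0.3, size=(20, dim))
                   for c in range(n_classes)])
    Y = np.repeat(np.arange(n_classes), 20)

    w = np.zeros(dim * n_classes)
    lr, reg = 0.1, 1e-3
    for _ in range(50):
        for x, y in zip(X, Y):
            y_hat = most_violated_constraint(w, x, y, n_classes)
            if y_hat != y:  # margin constraint violated: update toward true label
                w += lr * (joint_feature(x, y, n_classes)
                           - joint_feature(x, y_hat, n_classes))
            w *= (1 - lr * reg)  # shrink toward the regularizer

    acc = np.mean([predict(w, x, n_classes) == y for x, y in zip(X, Y)])
    print(f"training accuracy: {acc:.2f}")
```

For structured outputs such as parse trees or label sequences, the same pattern applies: only `joint_feature` and the two argmax routines change, with the argmax computed by a decoding algorithm (e.g. dynamic programming) rather than by enumerating labels.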