Abstract |
---|
Learning general functional dependencies is one of the main goals in machine learning. Recent progress in kernel-based methods has focused on designing flexible and powerful input representations. This paper addresses the complementary issue of problems involving complex outputs such as multiple dependent output variables and structured output spaces. We propose to generalize multiclass Support Vector Machine learning in a formulation that involves features extracted jointly from inputs and outputs. The resulting optimization problem is solved efficiently by a cutting plane algorithm that exploits the sparseness and structural decomposition of the problem. We demonstrate the versatility and effectiveness of our method on problems ranging from supervised grammar learning and named-entity recognition, to taxonomic text classification and sequence alignment. |
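The abstract describes training a structural SVM with a joint input-output feature map Psi(x, y) and a cutting plane algorithm that repeatedly adds the most violated constraints. The following is a minimal, hypothetical sketch of that idea for the multiclass special case; the names (`psi`, `loss_augmented_argmax`, `train`) and the subgradient inner solver are illustrative simplifications, not the paper's implementation, which re-solves a quadratic program over the working set.

```python
import numpy as np

def psi(x, y, k):
    """Joint feature map Psi(x, y): x placed in the block for class y."""
    out = np.zeros(k * x.size)
    out[y * x.size:(y + 1) * x.size] = x
    return out

def loss_augmented_argmax(w, x, y, k):
    """Most violated output: argmax over ybar of Delta(y, ybar) + <w, Psi(x, ybar)>,
    with 0/1 loss Delta standing in for a structured loss."""
    scores = [(yb != y) + w @ psi(x, yb, k) for yb in range(k)]
    return int(np.argmax(scores))

def train(X, Y, k, C=1.0, outer=20, inner=100, lr=0.05, eps=1e-3):
    """Cutting plane loop: grow a working set of violated margin constraints,
    then (approximately) re-solve the regularized hinge objective over it."""
    d = X.shape[1]
    w = np.zeros(k * d)
    constraints = []  # working set of (delta_psi, loss) pairs
    for _ in range(outer):
        added = 0
        for x, y in zip(X, Y):
            yb = loss_augmented_argmax(w, x, y, k)
            dpsi = psi(x, y, k) - psi(x, yb, k)
            loss = float(yb != y)
            if loss - w @ dpsi > eps:  # constraint violated beyond tolerance
                constraints.append((dpsi, loss))
                added += 1
        if not added:  # no new cutting planes: done
            break
        for _ in range(inner):  # subgradient re-solve over the working set
            grad = w.copy()  # gradient of the 0.5*||w||^2 regularizer
            for dpsi, loss in constraints:
                if loss - w @ dpsi > 0:  # active hinge term
                    grad -= C * dpsi
            w -= lr * grad / max(len(constraints), 1)
    return w

def predict(w, x, k):
    """Inference: the output with the highest joint score."""
    return int(np.argmax([w @ psi(x, yb, k) for yb in range(k)]))
```

The sparseness the abstract mentions shows up here as the small working set: only constraints that are actually violated ever enter the optimization, rather than all exponentially many output comparisons.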
Year | DOI | Venue | Title
---|---|---|---
2004 | 10.1145/1015330.1015341 | ICML | Support vector machine learning for interdependent and structured output spaces
Keywords | DocType | ISBN
---|---|---
support vector machine,present experiment,complex output,complementary issue,classification algorithm,output space,certain output-specific attribute,main goal,structured output space,machine learning,kernel-based method,optimization problem,arbitrary input,vector machine,general functional dependency,multiple dependent output variable,powerful input representation,general loss function,bayesian estimation,markov processes,feature extraction,bias,generating function,reinforcement learning,sequence alignment,variance | Conference | 0-542-12886-1
Citations | PageRank | References
---|---|---
670 | 38.75 | 10
Authors |
---|
4 |
Name | Order | Citations | PageRank |
---|---|---|---
Ioannis Tsochantaridis | 1 | 2861 | 155.43 |
Thomas Hofmann | 2 | 10064 | 1001.83 |
Thorsten Joachims | 3 | 17387 | 1254.06 |
Yasemin Altun | 4 | 2463 | 150.46