Title
Hidden Markov Support Vector Machines
Abstract
This paper presents a novel discriminative learning technique for label sequences based on a combination of the two most successful learning algorithms, Support Vector Machines and Hidden Markov Models, which we call Hidden Markov Support Vector Machine. The proposed architecture handles dependencies between neighboring labels using Viterbi decoding. In contrast to standard HMM training, the learning procedure is discriminative and is based on a maximum/soft margin criterion. Compared to previous methods like Conditional Random Fields, Maximum Entropy Markov Models and label sequence boosting, HM-SVMs have a number of advantages. Most notably, it is possible to learn non-linear discriminant functions via kernel functions. At the same time, HM-SVMs share the key advantages with other discriminative methods, in particular the capability to deal with overlapping features. We report experimental evaluations on two tasks, named entity recognition and part-of-speech tagging, that demonstrate the competitiveness of the proposed approach.
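The abstract's reference to Viterbi decoding over a learned discriminant function can be illustrated with a short sketch. The code below is a minimal, hypothetical example and not the paper's implementation: it assumes per-position label scores (e.g. produced by a kernelized discriminant) and a label-to-label transition score matrix, and recovers the highest-scoring label sequence.

```python
import numpy as np

def viterbi_decode(emission_scores, transition_scores):
    """Return the highest-scoring label sequence.

    emission_scores: (T, K) array, score of each of K labels at each
        of T positions (assumed given, e.g. by a discriminant function).
    transition_scores: (K, K) array, score for moving from label i at
        position t-1 to label j at position t.
    """
    T, K = emission_scores.shape
    delta = np.zeros((T, K))               # best score ending in each label
    backptr = np.zeros((T, K), dtype=int)  # best previous label

    delta[0] = emission_scores[0]
    for t in range(1, T):
        # cand[i, j] = best score ending in label i at t-1, then label j at t
        cand = delta[t - 1][:, None] + transition_scores + emission_scores[t]
        backptr[t] = cand.argmax(axis=0)
        delta[t] = cand.max(axis=0)

    # trace the best path backwards from the best final label
    path = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(backptr[t, path[-1]]))
    return path[::-1]

# toy usage: 4 positions, 3 labels, random scores
rng = np.random.default_rng(0)
print(viterbi_decode(rng.normal(size=(4, 3)), rng.normal(size=(3, 3))))
```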
Year
2003
Venue
ICML
Keywords
conditional random field, kernel function, support vector, discriminative learning, markov model, hidden markov model, support vector machine, maximum entropy, viterbi decoder
Field
Conditional random field, Maximum-entropy Markov model, Sequence labeling, Pattern recognition, Forward algorithm, Markov model, Computer science, Variable-order Markov model, Artificial intelligence, Hidden Markov model, Viterbi algorithm, Machine learning
DocType
Conference
Citations
257
PageRank
29.54
References
9
Authors
4
Name | Order | Citations | PageRank
Yasemin Altun | 1 | 2463 | 150.46
Ioannis Tsochantaridis | 2 | 2861 | 155.43
Thomas Hofmann | 3 | 10064 | 1001.83
T. Fawcett, N. Mishra | 4 | 257 | 29.54