| Abstract |
|---|
| Hidden Markov Models (HMMs) are very popular generative models for sequence data. Recent work has, however, shown that on many tasks, Conditional Random Fields (CRFs), a type of discriminative model, perform better than HMMs. We propose Hierarchical Hidden Conditional Random Fields (HHCRFs), a discriminative model corresponding to hierarchical HMMs (HHMMs). HHCRFs model the conditional probability of the states at the upper levels given observations. The states at the lower levels are hidden and marginalized out in the model definition. We have developed two algorithms for the model: a parameter learning algorithm that needs only the upper-level states in the training data, and the marginalized Viterbi algorithm, which computes the most likely upper-level state sequences by marginalizing over the lower-level states. In an experiment that involves segmenting electroencephalographic (EEG) data for a Brain-Computer Interface, HHCRFs outperform HHMMs. |
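The marginalized Viterbi idea from the abstract — maximize over upper-level state sequences while *summing* (marginalizing) over the hidden lower-level states, rather than maximizing over both levels as plain Viterbi would — can be illustrated with a deliberately simplified sketch. Here the lower-level state at each time step is assumed independent given the upper-level state, so the marginalization reduces to a per-step sum; the paper's actual HHMM recursion marginalizes over lower-level *paths* and is more involved. All function names, parameters, and numbers below are illustrative assumptions, not taken from the paper.

```python
# Toy two-level model (illustrative only): upper-level states emit
# observations through hidden lower-level states.

def marginalized_viterbi(pi, A, B, C, obs):
    """pi:  initial distribution over upper states, length Z
       A:   upper-state transition matrix, Z x Z
       B:   lower-state distribution given upper state, Z x S
       C:   emission probability given lower state, S x O
       obs: sequence of observation indices
       Returns the most likely upper-level state sequence."""
    Z, S = len(pi), len(C)
    n_obs = len(C[0])
    # Emission probability of observation o under upper state z, with
    # the lower state marginalized out: E[z][o] = sum_s B[z][s] * C[s][o]
    E = [[sum(B[z][s] * C[s][o] for s in range(S)) for o in range(n_obs)]
         for z in range(Z)]
    # Standard Viterbi recursion over the UPPER level only, using the
    # marginalized emission scores E.
    delta = [pi[z] * E[z][obs[0]] for z in range(Z)]
    psi = []
    for o in obs[1:]:
        back, new_delta = [], []
        for z in range(Z):
            best = max(range(Z), key=lambda zp: delta[zp] * A[zp][z])
            back.append(best)
            new_delta.append(delta[best] * A[best][z] * E[z][o])
        psi.append(back)
        delta = new_delta
    # Backtrack the best upper-level path.
    path = [max(range(Z), key=lambda z: delta[z])]
    for back in reversed(psi):
        path.append(back[path[-1]])
    return path[::-1]

# Usage with made-up parameters: two upper states, two lower states,
# two observation symbols.
pi = [0.6, 0.4]
A = [[0.7, 0.3], [0.3, 0.7]]
B = [[0.9, 0.1], [0.1, 0.9]]
C = [[0.8, 0.2], [0.2, 0.8]]
print(marginalized_viterbi(pi, A, B, C, [0, 0, 1, 1]))  # -> [0, 0, 1, 1]
```

The design point the sketch makes concrete: because the lower level is summed out before the max, the decoder commits only to upper-level labels — matching the paper's setting, where training data contains upper-level states only.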
| Year | DOI | Venue |
|---|---|---|
| 2007 | 10.1007/978-3-540-77226-2_39 | IDEAL |
| Keywords | Field | DocType |
|---|---|---|
| discriminative model, training data, HHCRF model, lower level, upper level, hierarchical HMMs, popular generative model, sequence data, conditional random fields, model definition, conditional random field, Viterbi algorithm, brain-computer interface, hidden Markov model, conditional probability | Maximum-entropy Markov model, Forward algorithm, Computer science, Artificial intelligence, Discriminative model, Viterbi algorithm, Hidden semi-Markov model, Pattern recognition, Markov model, Speech recognition, Hidden Markov model, Machine learning, Generative model | Conference |
| Volume | ISSN | ISBN |
|---|---|---|
| 4881 | 0302-9743 | 3-540-77225-1 |
| Citations | PageRank | References |
|---|---|---|
| 6 | 0.51 | 13 |
| Authors |
|---|
| 3 |
| Name | Order | Citations | PageRank |
|---|---|---|---|
| Takaaki Sugiura | 1 | 6 | 0.51 |
| Naoto Goto | 2 | 6 | 0.51 |
| Akira Hayashi | 3 | 51 | 9.08 |