Abstract |
---|
We present a method for sequential learning of increasingly complex graphical models for discriminating between two hypotheses. We generate forests for each hypothesis, each with no more edges than a spanning tree, that optimize an information-theoretic criterion. The method relies on a straightforward extension of the efficient max-weight spanning tree (MWST) algorithm that incorporates multivalued edge-weights. Each iteration produces nested forests with an increasing number of edges, each provably optimal compared to alternative forests. Empirical results demonstrate a lower probability of error than generative approaches. |
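The MWST step at the heart of the abstract can be sketched with a standard Kruskal-style greedy that, stopped after k edges, yields exactly the kind of nested forests described (a minimal illustration only, not the paper's multivalued edge-weight extension; the node count and weights below are hypothetical stand-ins for information-theoretic quantities):

```python
def max_weight_forest(n, edges, max_edges=None):
    """Greedy Kruskal on descending weights: add an edge iff it joins two
    distinct components. Stopping after max_edges edges yields a forest that
    is a prefix of the full max-weight spanning tree, so the forests for
    k = 1, 2, ... are nested."""
    parent = list(range(n))  # union-find forest over the n nodes

    def find(x):
        # Path-halving find: walk to the root, compressing as we go.
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    chosen = []
    for u, v, w in sorted(edges, key=lambda e: -e[2]):
        ru, rv = find(u), find(v)
        if ru != rv:          # edge connects two components: no cycle
            parent[ru] = rv
            chosen.append((u, v, w))
            if max_edges is not None and len(chosen) == max_edges:
                break
    return chosen
```

For example, on a 4-node graph, `max_weight_forest(4, edges, max_edges=2)` returns the first two edges of the full spanning tree, illustrating the nesting property the abstract exploits.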
Year | DOI | Venue |
---|---|---|
2008 | 10.1109/ICASSP.2008.4518000 | ICASSP (Las Vegas, NV) |
Keywords | Field | DocType |
---|---|---|
error statistics, trees (mathematics), alternative forests, complex graphical models, efficient max-weight spanning tree algorithm, error probability, max-weight discriminative forests, multivalued edge-weights, sequential learning, Discriminative Learning, Hypothesis Testing, Learning Graphical Models, Max-Weight Trees/Forests | Pattern recognition, Computer science, Spanning tree, Artificial intelligence, Graphical model, Probability of error, Sequence learning, Discriminative model, Statistical hypothesis testing, Machine learning, Discriminative learning | Conference |
ISSN | ISBN | Citations |
---|---|---|
1520-6149 | 978-1-4244-1484-0 | 1 |
PageRank | References | Authors |
---|---|---|
0.38 | 2 | 4 |
Name | Order | Citations | PageRank |
---|---|---|---|
Vincent Yan Fu Tan | 1 | 490 | 76.15 |
John W. Fisher III | 2 | 878 | 74.44 |
Alan S. Willsky | 3 | 7466 | 847.01 |
Fisher, J.W. | 4 | 1 | 0.38 |