Title: Infinite Markov-Switching Maximum Entropy Discrimination Machines
Abstract: In this paper, we present a method that combines the merits of Bayesian nonparametrics, specifically stick-breaking priors, and large-margin kernel machines in the context of sequential data classification. The proposed model postulates a set of (theoretically) infinitely many interdependent large-margin classifiers as model components that robustly capture the local nonlinearity of complex data. The postulated large-margin classifiers are connected through a Markov-switching construction that captures complex temporal dynamics in the modeled datasets. Appropriate stick-breaking priors are imposed over the component-switching mechanism of our model to allow for data-driven determination of the optimal number of component large-margin classifiers, under a standard nonparametric Bayesian inference scheme. Efficient model training is performed under the maximum entropy discrimination (MED) framework, which integrates the large-margin principle with Bayesian posterior inference. We evaluate our method on several real-world datasets and compare it to state-of-the-art alternatives.
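The two generative ingredients named in the abstract can be illustrated with a short, self-contained sketch: a truncated stick-breaking (GEM) prior over component weights, and a Markov-switching path over the resulting components. This is not the paper's implementation; the truncation level K, concentration alpha, sequence length T, and all function names below are illustrative assumptions, and the actual model further trains each component as a large-margin classifier under MED.

```python
# Minimal sketch (assumed names/values, not the paper's code): truncated
# stick-breaking weights and a Markov-switching path over components.
import numpy as np

rng = np.random.default_rng(0)

def stick_breaking_weights(alpha, K):
    # v_k ~ Beta(1, alpha); pi_k = v_k * prod_{j<k} (1 - v_j)
    v = rng.beta(1.0, alpha, size=K)
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - v[:-1])))
    return v * remaining

K, alpha, T = 10, 2.0, 50  # truncation level, concentration, sequence length (assumed)

# Stick-breaking prior on each row of the switching (transition) matrix:
# row k gives the probabilities of jumping from component k to each component.
trans = np.stack([stick_breaking_weights(alpha, K) for _ in range(K)])
trans /= trans.sum(axis=1, keepdims=True)  # absorb the truncation remainder

init = stick_breaking_weights(alpha, K)
init /= init.sum()

# Simulate the hidden switching path z_1..z_T; z_t indexes which component
# (large-margin classifier) is responsible for observation t.
z = np.empty(T, dtype=int)
z[0] = rng.choice(K, p=init)
for t in range(1, T):
    z[t] = rng.choice(K, p=trans[z[t - 1]])

print("components actually used:", np.unique(z))
```

Because the stick-breaking weights decay geometrically in expectation, only a few of the K truncated components are typically visited, which is how the data-driven determination of the number of component classifiers arises.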
Year: 2013
Venue: ICML
Field: Kernel (linear algebra), Nonlinear system, Pattern recognition, Inference, Computer science, Markov chain, Complex data type, Artificial intelligence, Principle of maximum entropy, Prior probability, Machine learning, Bayesian probability
DocType: Conference
Citations: 1
PageRank: 0.35
References: 4
Authors: 1
Name: Sotirios P. Chatzis
Order: 1
Citations: 30
PageRank: 5.94