Title |
---|
Latent subspace modeling of sequential data under the maximum entropy discrimination framework |
Abstract |
---|
Hidden Markov models (HMMs) are a popular approach for modeling continuous sequential data, typically based on the assumption of Gaussian-distributed observations. A significant issue confronting HMMs with Gaussian conditional densities is effectively modeling high-dimensional observations without becoming prone to overfitting or numerical singularities. To this end, one can resort to extracting lower-dimensional latent variable representations of the observed high-dimensional data, as part of the inference algorithm of the postulated HMM. Factor analysis (FA) is a well-established linear latent variable scheme that can be employed for this purpose; its functionality consists in modeling the covariances between the elements of multivariate observations under a set of linear assumptions. Recently, it has been proposed that FA can be effectively generalized under an efficient large-margin Bayesian inference perspective, namely maximum entropy discrimination (MED). This work capitalizes on these recent findings to derive an effective HMM-driven sequential data modeling framework for high-dimensional data. Our proposed approach extracts lower-dimensional latent variable representations of observed high-dimensional data, taking into account the large-margin principle. On this basis, it postulates that the temporal dynamics of the data are conditional on the inferred values of these latent variables. We devise efficient mean-field inference algorithms for our model, and demonstrate its advantages through a set of experiments. |
Year | DOI | Venue |
---|---|---|
2018 | 10.1016/j.neucom.2018.05.101 | Neurocomputing |
Keywords | Field | DocType |
Hidden Markov models, Large-margin principle, Maximum-entropy discrimination, Mean-field inference, Latent variable representation | Bayesian inference, Pattern recognition, Subspace topology, Inference, Latent variable, Gaussian, Artificial intelligence, Principle of maximum entropy, Overfitting, Hidden Markov model, Mathematics | Journal
Volume | ISSN | Citations |
312 | 0925-2312 | 0 |
PageRank | References | Authors |
0.34 | 16 | 1 |
Name | Order | Citations | PageRank |
---|---|---|---|
Sotirios P. Chatzis | 1 | 250 | 24.25 |