| Abstract |
|---|
| Stochastic matrices are arrays whose elements are discrete probabilities; they are widely used in techniques such as Markov chains and probabilistic latent semantic analysis. In such learning problems, the learned matrices, being stochastic, are non-negative, and all or part of their elements sum to one. Conventional multiplicative updates, which have been widely used for nonnegative learning, cannot accommodate the stochasticity constraint, and simply normalizing the nonnegative matrix at each step may adversely affect the convergence of the optimization algorithm. Here we discuss and compare two alternative ways of developing multiplicative update rules for stochastic matrices. The first reparameterizes the matrices before applying the multiplicative update principle; the second employs relaxation with Lagrangian multipliers, so that the updates jointly optimize the objective and steer the estimate towards the constraint manifold. We compare the new methods against the conventional normalization approach on two applications: parameter estimation of hidden Markov models and information-theoretic clustering. Empirical studies on both synthetic and real-world datasets demonstrate that the algorithms using the new methods perform more stably and efficiently than the conventional ones. |
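To make the "normalization at each step" baseline concrete, here is a minimal NumPy sketch (not the paper's algorithms; the counts matrix `C` and the likelihood objective are illustrative assumptions): it fits a row-stochastic Markov transition matrix to observed transition counts by taking a multiplicative gradient step on the log-likelihood `L(A) = Σ_ij C_ij log A_ij` and then renormalizing each row.

```python
import numpy as np

# Illustrative sketch, not the paper's method: multiplicative update plus
# per-step row normalization for a stochastic (transition) matrix A.
rng = np.random.default_rng(0)
n = 3
C = rng.integers(1, 20, size=(n, n)).astype(float)  # assumed: observed transition counts

A = np.full((n, n), 1.0 / n)  # start from the uniform row-stochastic matrix
for _ in range(50):
    grad = C / A                           # dL/dA_ij = C_ij / A_ij
    A = A * grad                           # multiplicative step: A_ij <- A_ij * grad_ij
    A = A / A.sum(axis=1, keepdims=True)   # heuristic: renormalize rows to restore stochasticity

mle = C / C.sum(axis=1, keepdims=True)     # closed-form MLE: row-normalized counts
print(np.allclose(A, mle))                 # prints True: heuristic recovers the MLE here
```

In this simple maximum-likelihood setting the heuristic happens to land on the constrained optimum; the abstract's point is that for harder objectives (e.g. the HMM and clustering problems studied in the paper) such ad hoc renormalization can destabilize or slow convergence, which motivates the reparameterized and Lagrangian-relaxed update rules.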
| Year | DOI | Venue |
|---|---|---|
| 2013 | 10.1007/978-3-642-38886-6_14 | Image Analysis, SCIA 2013: 18th Scandinavian Conference |
| Keywords | Field | DocType |
|---|---|---|
| nonnegative learning, stochastic matrix, multiplicative update | Mathematical optimization, Multiplicative function, Nonnegative matrix, Stochastic matrix, Computer science, Matrix (mathematics), Markov chain, Probabilistic latent semantic analysis, Cluster analysis, Hidden Markov model | Conference |
| Volume | ISSN | Citations | PageRank | References | Authors |
|---|---|---|---|---|---|
| 7944 | 0302-9743 | 2 | 0.37 | 5 | 3 |
| Name | Order | Citations | PageRank |
|---|---|---|---|
| Zhanxing Zhu | 1 | 199 | 29.61 |
| Zhirong Yang | 2 | 289 | 17.27 |
| Erkki Oja | 3 | 6701 | 797.08 |