Abstract |
---|
We propose a novel algebraic algorithmic framework for handling probability distributions represented by their cumulants, such as the mean and covariance matrix. As an example, we consider the unsupervised learning problem of finding the subspace on which several probability distributions agree. Instead of minimizing an objective function involving the estimated cumulants, we show that by treating the cumulants as elements of a polynomial ring we can solve the problem directly, at lower computational cost and with higher accuracy. Moreover, the algebraic viewpoint on probability distributions allows us to invoke the theory of algebraic geometry, which we demonstrate with a compact proof of an identifiability criterion. |
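The problem setting described in the abstract can be illustrated numerically. The sketch below is not the paper's algebraic algorithm; it is a toy example (all distributions, subspaces, and values are invented for illustration) showing what "agreeing on a subspace" means for the first two cumulants: two Gaussians whose means and covariances differ in full space coincide once projected onto the right subspace.

```python
import numpy as np

# Toy illustration of the problem setting (not the paper's method):
# find a projection P such that the projected cumulants -- here mean
# and covariance -- of several distributions coincide.  We construct
# two 3-dimensional Gaussians that agree only on the first coordinate
# axis and verify that projecting onto that axis equalizes both
# cumulants.

mu1 = np.array([0.0, 1.0, -1.0])
mu2 = np.array([0.0, 2.0, 3.0])     # means differ off the x-axis
cov1 = np.diag([1.0, 1.0, 1.0])
cov2 = np.diag([1.0, 4.0, 0.5])     # covariances differ off the x-axis

P = np.array([[1.0, 0.0, 0.0]])     # projection onto the x-axis

# First and second projected cumulants of each distribution.
m1, m2 = P @ mu1, P @ mu2
c1, c2 = P @ cov1 @ P.T, P @ cov2 @ P.T

print(np.allclose(m1, m2), np.allclose(c1, c2))  # both projected cumulants agree
```

In the paper's approach, equations like `P @ (mu1 - mu2) = 0` and `P @ (cov1 - cov2) @ P.T = 0` are treated as polynomial conditions on the entries of `P`, rather than as terms of an objective function to minimize.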
Year | DOI | Venue |
---|---|---|
2012 | 10.5555/2503308.2188416 | Journal of Machine Learning Research |
Keywords | Field | DocType |
---|---|---|
novel algebraic algorithmic framework, covariance matrix, estimated cumulants, identifiability criterion, unsupervised learning problem, algebraic geometry, compact proof, algebraic viewpoint, algebraic geometric comparison, higher accuracy, probability distribution, unsupervised learning, cumulant, objective function, polynomial ring | Convolution of probability distributions, Function field of an algebraic variety, K-distribution, Identifiability, Algebraic function, Probability distribution, Artificial intelligence, Geometric distribution, Real algebraic geometry, Mathematics, Machine learning | Journal |
Volume | Issue | ISSN |
---|---|---|
13 | 1 | 1532-4435 |

Citations | PageRank | References |
---|---|---|
6 | 1.13 | 15 |
Authors |
---|
5 |
Name | Order | Citations | PageRank |
---|---|---|---|
Franz J. Király | 1 | 50 | 14.98 |
Paul von Bünau | 2 | 304 | 15.80 |
Frank C. Meinecke | 3 | 447 | 29.21 |
Duncan A. J. Blythe | 4 | 48 | 4.85 |
Klaus-Robert Müller | 5 | 12756 | 1615.17 |