Abstract |
---|
We propose two strategies to improve optimization in information geometry. First, a local Euclidean embedding is identified by whitening the tangent space, which leads to an additive parameter update sequence that approximates the geodesic flow to the optimal density model. Second, removal of the minor components of gradients enhances the estimation of the Fisher information matrix and reduces the computational cost. We also prove that dimensionality reduction is necessary for learning multidimensional linear transformations. The optimization based on the principal whitened gradients demonstrates faster and more robust convergence in simulations on unsupervised learning with synthetic data and on discriminant analysis of breast cancer data. |
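The abstract's two ideas, whitening the gradient in the tangent space and discarding its minor components, can be illustrated with a minimal numerical sketch. This is not the paper's implementation: the function name, the empirical Fisher estimate (average outer product of per-sample gradients), and the truncation scheme are all illustrative assumptions.

```python
import numpy as np

def principal_whitened_gradient(grads, g, k):
    """Hypothetical sketch: whiten gradient g using the top-k
    eigencomponents of an empirical Fisher information estimate.

    grads : (n_samples, dim) per-sample gradients
    g     : (dim,) gradient to transform
    k     : number of principal components to keep
    """
    # Empirical Fisher estimate: average outer product of sample gradients
    F = grads.T @ grads / grads.shape[0]
    # Eigendecomposition of the (symmetric) Fisher estimate
    eigvals, eigvecs = np.linalg.eigh(F)
    # Keep the k largest eigenpairs (principal components); dropping the
    # minor components both regularizes F and cuts computation
    idx = np.argsort(eigvals)[::-1][:k]
    V, lam = eigvecs[:, idx], eigvals[idx]
    # Whitening: rescale each principal direction by 1/sqrt(eigenvalue),
    # so the update behaves additively in locally Euclidean coordinates
    return V @ ((V.T @ g) / np.sqrt(lam))
```

When the Fisher estimate is the identity, whitening leaves the gradient unchanged, which gives a quick sanity check of the transform.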
Year | DOI | Venue |
---|---|---|
2008 | 10.1016/j.neunet.2007.12.016 | Neural Networks |
Keywords | Field | DocType
---|---|---
Information geometry, Natural gradient, Whitening, Principal components, Riemannian manifold | Information geometry, Topology, Mathematical optimization, Dimensionality reduction, Algorithm, Unsupervised learning, Fisher information, Linear discriminant analysis, Artificial neural network, Mathematics, Principal component analysis, Tangent space | Journal
Volume | Issue | ISSN
---|---|---
21 | 2 | 0893-6080
Citations | PageRank | References
---|---|---
3 | 0.45 | 8
Authors |
---|
2 |
Name | Order | Citations | PageRank |
---|---|---|---|
Zhirong Yang | 1 | 289 | 17.27 |
Jorma Laaksonen | 2 | 1162 | 176.93 |