Title
Principal whitened gradient for information geometry.
Abstract
We propose two strategies to improve the optimization in information geometry. First, a local Euclidean embedding is identified by whitening the tangent space, which leads to an additive parameter update sequence that approximates the geodesic flow to the optimal density model. Second, removal of the minor components of gradients enhances the estimation of the Fisher information matrix and reduces the computational cost. We also prove that dimensionality reduction is necessary for learning multidimensional linear transformations. The optimization based on the principal whitened gradients demonstrates faster and more robust convergence in simulations on unsupervised learning with synthetic data and on discriminant analysis of breast cancer data.
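As a rough illustration of the two strategies the abstract describes, the sketch below is one plausible reading, not the paper's exact algorithm: it estimates the Fisher information matrix from per-sample score vectors, keeps only its principal eigen-directions, and whitens the ordinary gradient in that subspace before an additive update. The function name, the placeholder data, the eps damping term, and the step size are all illustrative assumptions.

import numpy as np

def principal_whitened_gradient(score_samples, grad, k, eps=1e-8):
    """Sketch of a principal-whitened-gradient step (assumed form, not the paper's exact method).

    score_samples : (n, d) per-sample score vectors d/dtheta log p(x; theta),
                    used for an empirical estimate of the Fisher information matrix.
    grad          : (d,) ordinary gradient of the objective.
    k             : number of principal components of the Fisher matrix to keep.
    """
    # Empirical Fisher information matrix (outer-product estimate).
    G = score_samples.T @ score_samples / score_samples.shape[0]

    # Eigendecomposition; keep only the k leading (principal) eigenpairs.
    eigvals, eigvecs = np.linalg.eigh(G)
    idx = np.argsort(eigvals)[::-1][:k]
    lam, U = eigvals[idx], eigvecs[:, idx]

    # Whiten the gradient in the principal subspace: U diag(lam^{-1/2}) U^T grad.
    # Discarding the minor components both regularizes the Fisher estimate and
    # reduces the cost of the inverse square root, as the abstract suggests.
    return U @ ((U.T @ grad) / np.sqrt(lam + eps))

# Hypothetical usage: one additive parameter update in the whitened coordinates.
rng = np.random.default_rng(0)
scores = rng.standard_normal((200, 10))   # placeholder score vectors
g = rng.standard_normal(10)               # placeholder gradient
theta = np.zeros(10)
theta -= 0.1 * principal_whitened_gradient(scores, g, k=4)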
Year
2008
DOI
10.1016/j.neunet.2007.12.016
Venue
Neural Networks
Keywords
Information geometry,Natural gradient,Whitening,Principal components,Riemannian manifold
Field
Information geometry,Topology,Mathematical optimization,Dimensionality reduction,Algorithm,Unsupervised learning,Fisher information,Linear discriminant analysis,Artificial neural network,Mathematics,Principal component analysis,Tangent space
DocType
Journal
Volume
21
Issue
2
ISSN
0893-6080
Citations
3
PageRank
0.45
References
8
Authors
2
Name               Order    Citations    PageRank
Zhirong Yang       1        289          17.27
Jorma Laaksonen    2        1162         176.93