Title
Approximated Geodesic Updates with Principal Natural Gradients
Abstract
We propose a novel optimization algorithm that overcomes two drawbacks of Amari's natural gradient updates in information geometry. First, prewhitening the tangent vectors locally converts a Riemannian manifold to a Euclidean space, so that the additive parameter update sequence approximates geodesics. Second, we prove that dimensionality reduction of the natural gradients is necessary for learning multidimensional linear transformations. Removal of minor components also leads to noise reduction and better computational efficiency. The proposed method demonstrates faster and more robust convergence in simulations on recovering a Gaussian mixture from artificial data and on discriminative learning of ionosphere data.
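To make the two ingredients named in the abstract concrete, below is a minimal Python/NumPy sketch of a gradient step that (i) rescales the gradient by the inverse of the local metric (e.g. the Fisher information) and (ii) keeps only the k principal eigendirections of that metric, discarding minor components. The function name principal_natural_gradient_step, the quadratic toy problem, and all parameter values are illustrative assumptions; this is a generic instance of prewhitening plus dimensionality reduction of a natural gradient, not the authors' exact algorithm.

import numpy as np

def principal_natural_gradient_step(theta, grad, metric, k, lr=0.1, eps=1e-8):
    """One illustrative update: restrict the natural gradient to the k
    principal eigendirections of the local metric (hypothetical sketch)."""
    # Eigendecompose the symmetric positive semi-definite metric G.
    eigval, eigvec = np.linalg.eigh(metric)

    # Keep the k leading eigendirections; minor components are dropped.
    idx = np.argsort(eigval)[::-1][:k]
    V = eigvec[:, idx]                     # principal directions, shape (d, k)
    lam = np.maximum(eigval[idx], eps)     # guard against near-zero eigenvalues

    # Rank-k approximation of G^{-1} grad:  V diag(1/lam) V^T grad.
    coeffs = (V.T @ grad) / lam
    nat_grad = V @ coeffs

    # Additive update; in locally whitened coordinates this approximates
    # moving along the corresponding geodesic direction.
    return theta - lr * nat_grad

# Toy usage: quadratic loss L(theta) = 0.5 theta^T A theta with metric G = A.
# The direction with the smallest eigenvalue (a "minor component") is
# deliberately left untouched because k = 2 < 3.
A = np.diag([10.0, 1.0, 0.1])
theta = np.array([1.0, 1.0, 1.0])
for _ in range(20):
    grad = A @ theta
    theta = principal_natural_gradient_step(theta, grad, A, k=2, lr=0.5)
print(theta)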
Year
2007
DOI
10.1109/IJCNN.2007.4371149
Venue
IJCNN
Keywords
amari natural gradient, optimization, differential geometry, principal natural gradients, multidimensional linear transformations, euclidean space, learning (artificial intelligence), gaussian mixture, dimensionality reduction, discriminative learning, riemannian manifold, gradient methods, gaussian processes, tangent vectors, geodesic update, information geometry, ionosphere, linear transformation, noise reduction
Field
Information geometry, Dimensionality reduction, Computer science, Riemannian manifold, Tangent vector, Euclidean space, Differential geometry, Gaussian process, Artificial intelligence, Machine learning, Geodesic
DocType
Conference
ISSN
1098-7576
ISBN
978-1-4244-1380-5 (E-ISBN)
Citations
1
PageRank
0.39
References
0
Authors
2
Name             Order  Citations  PageRank
Zhirong Yang     1      289        17.27
Jorma Laaksonen  2      1162       176.93