Title
Preconditioner on Matrix Lie Group for SGD
Abstract
We study two types of preconditioners and preconditioned stochastic gradient descent (SGD) methods in a unified framework. We call the first one the Newton type due to its close relationship to the Newton method, and the second one the Fisher type as its preconditioner is closely related to the inverse of the Fisher information matrix. Both preconditioners can be derived from one framework and efficiently estimated on any matrix Lie group designated by the user, using natural or relative gradient descent to minimize certain preconditioner estimation criteria. Many existing preconditioners and methods, e.g., RMSProp, Adam, KFAC, equilibrated SGD, and batch normalization, are special cases of or closely related to either the Newton-type or the Fisher-type preconditioners. Experimental results on relatively large-scale machine learning problems are reported to study performance.
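The Newton-type construction summarized above can be illustrated with a small sketch: the preconditioner is factored as P = QᵀQ, Q is kept on a matrix Lie group (here the group of invertible upper-triangular matrices, one common choice), and a relative-gradient step fits Q to a perturbed parameter/gradient pair before each preconditioned SGD update. The NumPy sketch below is illustrative only, not the paper's implementation or experiments; the toy quadratic objective, step sizes, and variable names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy quadratic: minimize 0.5 * theta^T A theta - b^T theta (Hessian is A).
n = 10
G = rng.standard_normal((n, n))
A = G @ G.T + 0.1 * np.eye(n)   # ill-conditioned positive-definite Hessian
b = rng.standard_normal(n)

def grad(theta):
    return A @ theta - b

theta = rng.standard_normal(n)
Q = 0.1 * np.eye(n)             # preconditioner factor, P = Q^T Q
lr_theta, lr_Q = 0.2, 0.1

for step in range(1000):
    g = grad(theta)

    # Perturbed parameter / perturbed gradient pair; the Newton-type
    # criterion E[dg^T P dg + dtheta^T P^{-1} dtheta] is minimized when
    # P approximates the inverse Hessian.
    dtheta = 1e-4 * rng.standard_normal(n)
    dg = grad(theta + dtheta) - g

    # Relative-gradient step for Q on the group of invertible
    # upper-triangular matrices: Q <- (I - mu * triu(a a^T - c c^T)) Q,
    # with a = Q dg and c = Q^{-T} dtheta.
    a = Q @ dg
    c = np.linalg.solve(Q.T, dtheta)
    dQ = np.triu(np.outer(a, a) - np.outer(c, c))
    Q -= lr_Q / (np.max(np.abs(dQ)) + 1e-12) * (dQ @ Q)

    # Preconditioned SGD step with P = Q^T Q (approaches a Newton step).
    theta -= lr_theta * (Q.T @ (Q @ g))

print("final loss:", 0.5 * theta @ A @ theta - b @ theta)
```

Because the multiplicative update (I - mu * dQ) keeps Q upper triangular with positive diagonal, it stays in the chosen group and remains invertible throughout, which is the practical reason for estimating the factor on a Lie group rather than estimating P directly.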
Year
2019
Venue
International Conference on Learning Representations
Field
Applied mathematics, Lie group, Mathematical optimization, Stochastic gradient descent, Gradient descent, Normalization (statistics), Preconditioner, Matrix (mathematics), Fisher information, Mathematics, Newton's method
DocType
Conference
Citations
0
PageRank
0.34
References
0
Authors
1
Name	Order	Citations	PageRank
Xi-Lin Li	1	547	34.85