Title |
---|
Adaptive Natural Gradient Method for Learning Neural Networks with Large Data set in Mini-Batch Mode |
Abstract |
---|
Natural gradient learning, a gradient descent method, is known to have ideal convergence properties when training hierarchical machines such as layered neural networks. However, two limitations degrade its practical usability: it requires the true probability density function of the input variables, and it incurs a heavy computational cost due to matrix inversion. Although an adaptive approximation has been developed, it is essentially derived for the on-line learning mode, in which a single update is made for each data sample. Noting that the on-line learning mode is not appropriate for tasks with a huge number of training samples, this paper proposes a practical implementation of natural gradient for the mini-batch learning mode, which is the most common setting in real applications with large data sets. Computational experiments on benchmark datasets show the efficiency of the proposed method. |
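The abstract does not give the paper's exact update rule, but the general idea it builds on can be sketched. In adaptive natural gradient learning, the inverse Fisher information matrix is estimated recursively from gradients rather than inverted explicitly, and the parameters are updated with the preconditioned gradient. A minimal NumPy sketch, assuming a classical adaptive inverse-Fisher recursion of the form `G_inv <- (1 + eps) * G_inv - eps * (G_inv g)(G_inv g)^T` applied to a mini-batch-averaged gradient (the function and variable names here are illustrative, not the paper's):

```python
import numpy as np

def adaptive_natural_gradient_step(w, G_inv, grad_fn, batch, eta=0.01, eps=0.01):
    """One mini-batch step of approximate natural gradient descent.

    w       : current parameter vector
    G_inv   : running estimate of the inverse Fisher information matrix
    grad_fn : grad_fn(w, x) -> per-sample loss gradient at w
    batch   : iterable of input samples forming one mini-batch
    eta     : learning rate; eps : adaptation rate for the inverse-Fisher estimate

    This recursion form is an illustrative assumption, not the paper's
    exact algorithm.
    """
    # Average the per-sample loss gradients over the mini-batch.
    grads = np.stack([grad_fn(w, x) for x in batch])
    g = grads.mean(axis=0)

    # Adaptive update of the inverse Fisher estimate, avoiding an
    # explicit matrix inversion:
    #   G_inv <- (1 + eps) * G_inv - eps * (G_inv g)(G_inv g)^T
    v = G_inv @ g
    G_inv = (1.0 + eps) * G_inv - eps * np.outer(v, v)

    # Natural gradient parameter update with the preconditioned gradient.
    w = w - eta * (G_inv @ g)
    return w, G_inv
```

Compared with plain gradient descent, the only extra per-step cost is a few matrix-vector products, which is the practical appeal of the adaptive approximation over recomputing and inverting the Fisher matrix.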
Year | DOI | Venue |
---|---|---|
2019 | 10.1109/ICAIIC.2019.8669082 | 2019 International Conference on Artificial Intelligence in Information and Communication (ICAIIC) |
Keywords | Field | DocType |
---|---|---|
Neural networks, Learning systems, Convergence, Computational efficiency, Training data, Standards, Time series analysis | Convergence (routing), Time series, Gradient descent, Matrix (mathematics), Computer science, Usability, Algorithm, Batch processing, Artificial neural network, Probability density function | Conference |
ISBN | Citations | PageRank |
---|---|---|
978-1-5386-7822-0 | 0 | 0.34 |
References | Authors |
---|---|
0 | 2 |
Name | Order | Citations | PageRank |
---|---|---|---|
Hyeyoung Park | 1 | 194 | 32.70 |
Kwanyong Lee | 2 | 13 | 4.38 |