Title
Common nature of learning between BP-type and Hopfield-type neural networks
Abstract
As two well-known classes of neural networks, error back-propagation (BP) algorithm based neural networks (i.e., BP-type neural networks, BPNNs) and Hopfield-type neural networks (HNNs) have been proposed, developed, and investigated extensively for scientific research and engineering applications. The two classes differ considerably in network architecture, physical meaning, and training pattern. In this literature-review paper, we present, in a relatively complete and creative manner, the common nature of learning between BP-type and Hopfield-type neural networks in solving various (mathematical) problems. Specifically, comparing the BPNN with the HNN on the same problem-solving task, e.g., matrix inversion or function approximation, we show that the BPNN weight-updating formula and the HNN state-transition equation turn out to be essentially the same. This phenomenon suggests that, given a neural-network model for solving a specific problem, a potential dual neural-network model can be developed.
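The claimed equivalence can be illustrated for the matrix-inversion case with a minimal sketch (my own illustration, not code from the paper): minimizing the error E(X) = ½‖AX − I‖²_F by gradient descent yields the update X ← X − η·Aᵀ(AX − I), which can be read either as a BP-style weight-updating formula (with X playing the role of the weights) or as a discretized Hopfield-style state transition (with X as the network state).

```python
import numpy as np

def iterative_inverse(A, eta=0.05, steps=2000):
    """Approximate inv(A) by gradient descent on E(X) = 0.5*||A X - I||_F^2.

    Illustrative sketch only (not the paper's exact model): the update below
    has the same form whether viewed as a BPNN weight update or as a
    discretized HNN state transition.
    """
    n = A.shape[0]
    I = np.eye(n)
    X = np.zeros((n, n))  # initial weights / initial network state
    for _ in range(steps):
        # Gradient of E w.r.t. X is A^T (A X - I); identical in both readings.
        X -= eta * A.T @ (A @ X - I)
    return X

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
X = iterative_inverse(A)
print(np.allclose(X, np.linalg.inv(A)))  # True
```

The learning rate η must be small enough (roughly η < 2 divided by the largest eigenvalue of AᵀA) for the iteration to converge; this mirrors the step-size condition of a discretized continuous-time network.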
Year
2015
DOI
10.1016/j.neucom.2015.04.032
Venue
Neurocomputing
Keywords
Neural networks, Common nature of learning, BP-type, Hopfield-type, Problem solving
Field
Function approximation, Physical neural network, Network architecture, Time delay neural network, Types of artificial neural networks, Artificial intelligence, Deep learning, Artificial neural network, Cellular neural network, Machine learning, Mathematics
DocType
Journal
Volume
167
Issue
C
ISSN
0925-2312
Citations
10
PageRank
0.55
References
30
Authors
5
Order  Name           Citations  PageRank
1      Dongsheng Guo  22         2.12
2      Yunong Zhang   2344       162.43
3      Zhengli Xiao   43         3.17
4      Mingzhi Mao    61         7.82
5      Jianxi Liu     10         0.55