Abstract |
---|
As two well-known classes of neural networks, error back-propagation (BP) algorithm based neural networks (i.e., BP-type neural networks, BPNNs) and Hopfield-type neural networks (HNNs) have been proposed, developed, and investigated extensively for scientific research and engineering applications. They differ considerably in network architecture, physical meaning, and training pattern. In this literature-review paper, we present in a relatively complete and creative manner the common natures of learning shared by BP-type and Hopfield-type neural networks in solving various (mathematical) problems. Specifically, comparing the BPNN with the HNN on the same problem-solving task, e.g., matrix inversion or function approximation, we show that the BPNN weight-updating formula and the HNN state-transition equation turn out to be essentially the same. This interesting phenomenon suggests that, given a neural-network model for a specific problem, its potential dual neural-network model can be developed accordingly. |

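The claimed equivalence can be illustrated for matrix inversion. A minimal sketch (not taken from the paper; step size, matrix, and iteration count are illustrative assumptions): minimizing the energy E(X) = ||AX - I||_F^2 / 2 by gradient descent yields the update X ← X - η·Aᵀ(AX - I), which can be read either as a BP-style weight-updating formula (X as the weight matrix) or as a discretized Hopfield-style state-transition equation (X as the network state), and converges to A⁻¹ for a sufficiently small step size η.

```python
import numpy as np

# Illustrative sketch: gradient-descent dynamics for matrix inversion.
# The same update rule admits two readings:
#   - BPNN view: X is a weight matrix trained to minimize ||A X - I||_F^2 / 2
#   - HNN view:  X is a network state evolving under discretized dynamics
# A, eta, and the iteration count below are assumptions for demonstration.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
X = np.zeros_like(A)   # initial weights / initial state
eta = 0.05             # learning rate / integration step size

for _ in range(2000):
    # X <- X - eta * A^T (A X - I): weight update == state transition
    X = X - eta * A.T @ (A @ X - np.eye(2))

print(np.allclose(X, np.linalg.inv(A), atol=1e-6))
```

Convergence requires η < 2/λ_max(AᵀA); here λ_max ≈ 26.2, so η = 0.05 is safely inside the stable range.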
Year | DOI | Venue |
---|---|---|
2015 | 10.1016/j.neucom.2015.04.032 | Neurocomputing |
Keywords | Field | DocType
---|---|---|
Neural networks, Common nature of learning, BP-type, Hopfield-type, Problem solving | Function approximation, Physical neural network, Network architecture, Time delay neural network, Types of artificial neural networks, Artificial intelligence, Deep learning, Artificial neural network, Cellular neural network, Machine learning, Mathematics | Journal

Volume | Issue | ISSN
---|---|---|
167 | C | 0925-2312

Citations | PageRank | References
---|---|---|
10 | 0.55 | 30

Authors |
---|
5 |

Name | Order | Citations | PageRank |
---|---|---|---|
Dongsheng Guo | 1 | 22 | 2.12 |
Yunong Zhang | 2 | 2344 | 162.43 |
Zhengli Xiao | 3 | 43 | 3.17 |
Mingzhi Mao | 4 | 61 | 7.82 |
Jianxi Liu | 5 | 10 | 0.55 |