Abstract |
---|
We modify the architecture of radial basis function neural networks to model linear as well as the usual nonlinear input–output relationships. The resulting network learns in fewer iterations and is more accurate than radial basis function neural networks or multiple layered perceptrons. Two training algorithms are presented for the new network: quick training and full training. The full training algorithm adjusts more parameters and requires more computation, whereas quick training is simpler and faster while remaining very accurate. These networks, like radial basis function networks, are at least as powerful as the Takagi–Sugeno type of fuzzy rule-based systems. We compare their training results with those of multiple layered perceptrons and radial basis function neural networks on three data sets to show the advantage of our architecture and algorithms. |
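The abstract describes augmenting a radial basis function network with direct linear input–output connections, and a "quick training" mode that is simpler than full training. A minimal sketch of that idea, assuming Gaussian basis functions with fixed centers and widths and a closed-form least-squares solve for the output weights (the paper's exact algorithms are not reproduced here; `rbf_design`, `quick_train`, and the toy data are illustrative):

```python
import numpy as np

def rbf_design(X, centers, width):
    # Gaussian RBF activations: one column per center, one row per sample.
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

def quick_train(X, y, centers, width):
    # "Quick training" sketch: centers/widths stay fixed, and the output
    # weights over [RBF activations | raw linear inputs | bias] are found
    # in one linear least-squares solve. The linear columns let the
    # network capture a linear trend directly, as the abstract describes.
    Phi = np.hstack([rbf_design(X, centers, width), X, np.ones((len(X), 1))])
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w

def predict(X, centers, width, w):
    Phi = np.hstack([rbf_design(X, centers, width), X, np.ones((len(X), 1))])
    return Phi @ w

# Toy target mixing a linear trend with a nonlinear bump.
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(200, 1))
y = 0.5 * X[:, 0] + np.exp(-4 * X[:, 0] ** 2)

centers = np.linspace(-2, 2, 8).reshape(-1, 1)
w = quick_train(X, y, centers, width=0.5)
err = np.abs(predict(X, centers, width=0.5, w=w) - y).max()
```

Because the linear and bias columns sit alongside the RBF columns in one design matrix, a purely linear component of the target is absorbed exactly rather than being approximated by many basis functions, which is the intuition behind the hybrid architecture.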
Year | DOI | Venue |
---|---|---|
2002 | 10.1016/S0925-2312(01)00613-0 | Neurocomputing |
Keywords | Field | DocType
---|---|---|
Radial basis functions, Neural networks, Fuzzy classification | Radial basis function network, Neuro-fuzzy, Radial basis function, Fuzzy classification, Activation function, Algorithm, Artificial intelligence, Artificial neural network, Perceptron, Mathematics, Machine learning, Fuzzy rule | Journal
Volume | Issue | ISSN
---|---|---|
48 | 1 | 0925-2312 |
Citations | PageRank | References
---|---|---|
25 | 1.97 | 37 |
Authors |
---|
1 |
Name | Order | Citations | PageRank |
---|---|---|---|
Carl G. Looney | 1 | 198 | 21.58 |