Abstract |
---|
A neural network model of competitive learning was proposed. In this model, output cells in the network were self-organized to represent the distribution of input pattern vectors. The self-organization was based upon a generalized energy function. The network was mathematically proved to converge to the global minimum of the energy function when the number of output cells is the same as that of input patterns. In this global minimum, a one-to-one correspondence was established between input patterns and output cells, and an output cell responded exclusively to its corresponding input pattern. The model was compared with conventional models of competitive learning or feature detection. Typical behavior of the network was demonstrated by computer simulation, which included the case of clustered input patterns. |
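The abstract's setting — output cells that self-organize until each responds exclusively to one input pattern — can be illustrated with a plain winner-take-all competitive learning sketch. This is a generic illustration, not the paper's model: the paper's generalized energy function is not given here, so the sketch assumes the standard quantization energy E = Σ_x min_j ||x − w_j||², where moving the winning cell's weight vector toward the input lowers E.

```python
import random

def competitive_learning(patterns, n_outputs, lr=0.1, epochs=100, seed=0):
    """Minimal winner-take-all competitive learning (illustrative sketch).

    Each output cell j holds a weight vector w_j. For each input x, the
    nearest cell wins and moves toward x, which is a gradient step on the
    quantization energy E = sum_x min_j ||x - w_j||^2. This is NOT the
    paper's generalized energy function, which is not reproduced here.
    """
    rng = random.Random(seed)
    # Initialize each weight vector near a distinct input pattern
    # (sampling without replacement keeps the demo's winners distinct).
    starts = rng.sample(patterns, n_outputs)
    weights = [[v + rng.uniform(-0.05, 0.05) for v in p] for p in starts]
    for _ in range(epochs):
        for x in patterns:
            # Winner: output cell whose weight vector is nearest to x.
            dists = [sum((xi - wi) ** 2 for xi, wi in zip(x, w))
                     for w in weights]
            win = dists.index(min(dists))
            # Move only the winner toward the input (energy descent).
            weights[win] = [wi + lr * (xi - wi)
                            for xi, wi in zip(x, weights[win])]
    return weights

# With as many output cells as input patterns, training drives each cell
# onto one pattern -- the one-to-one correspondence the abstract describes.
patterns = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]]
trained = competitive_learning(patterns, n_outputs=3)
```

With three well-separated patterns and three output cells, each cell's weight vector converges onto a different pattern, mirroring the global-minimum case the abstract proves for equal numbers of patterns and cells.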
Year | DOI | Venue |
---|---|---|
1993 | 10.1016/S0893-6080(09)80021-X | Neural Networks |
Keywords | Field | DocType |
energy function,neural network model,input pattern,global minimum,feature detection,competitive learning,output cell,convergence,potential function,input pattern vector,corresponding input pattern,generalized energy function,conventional model | Convergence (routing),Competitive learning,Feature detection,Artificial intelligence,Generalized function,Artificial neural network,Mathematics | Journal |
Volume | Issue | ISSN |
6 | 8 | 0893-6080 |
Citations | PageRank | References |
4 | 0.43 | 1 |
Authors |
---|
1 |
Name | Order | Citations | PageRank |
---|---|---|---|
Tadashi Masuda | 1 | 4 | 0.43 |