Abstract |
---|
In this paper, we develop a necessary and sufficient condition for a local minimum of the vector quantization problem to be a global minimum, and we present a competitive learning algorithm based on this condition that has two learning terms: the first regulates the force of attraction between the synaptic weight vectors and the input patterns in order to reach a local minimum, while the second regulates the repulsion between the synaptic weight vectors and the inputs' gravity center to favor convergence to the global minimum. This algorithm leads to optimal or near-optimal solutions and allows the network to escape from local minima during training. Experimental results in image compression demonstrate that it outperforms the simple competitive learning algorithm, giving better codebooks. |
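The abstract describes an update with two terms: attraction of the winning weight vector toward the input pattern, and repulsion away from the inputs' gravity center. A minimal sketch of such a two-term winner-take-all update is below; the function name, learning rates, and the exact form of the repulsion term are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def two_term_update(weights, x, centroid, lr_attract=0.1, lr_repel=0.01):
    """One step of a two-term competitive learning update (illustrative sketch).

    The winner is attracted toward the input pattern x and simultaneously
    repelled from the gravity center (mean) of the inputs, as a heuristic
    for escaping local minima. Rates and repulsion form are assumptions.
    """
    # winner-take-all: the weight vector closest to the input
    winner = np.argmin(np.linalg.norm(weights - x, axis=1))
    # attraction term: move the winner toward the input pattern
    weights[winner] += lr_attract * (x - weights[winner])
    # repulsion term: push the winner away from the inputs' gravity center
    weights[winner] -= lr_repel * (centroid - weights[winner])
    return weights, winner
```

Run over many input presentations with decaying learning rates, this reduces to simple competitive learning as the repulsion rate goes to zero.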
Year | DOI | Venue |
---|---|---|
2002 | 10.1023/A:1015785501885 | Neural Processing Letters |

Keywords | Field | DocType |
---|---|---|
competitive learning, global search, image compression, neural networks, vector quantization | Convergence (routing), Competitive learning, Algorithm, Maxima and minima, Vector quantization, Artificial intelligence, Artificial neural network, Synaptic weight, Mathematics, Machine learning, Image compression, Codebook | Journal |

Volume | Issue | ISSN |
---|---|---|
15 | 3 | 1573-773X |

Citations | PageRank | References |
---|---|---|
5 | 0.48 | 9 |
Authors |
---|
4 |
Name | Order | Citations | PageRank |
---|---|---|---|
J. Muñoz-Perez | 1 | 8 | 1.65 |
José Antonio Gómez-Ruiz | 2 | 88 | 12.47 |
Ezequiel López-Rubio | 3 | 21 | 5.48 |
M. Angeles García-Bernal | 4 | 6 | 0.85 |