Title
Expansive and Competitive Learning for Vector Quantization
Abstract
In this paper, we develop a necessary and sufficient condition for a local minimum of the vector quantization problem to be a global minimum, and we present a competitive learning algorithm based on this condition. The algorithm has two learning terms: the first regulates the force of attraction between the synaptic weight vectors and the input patterns in order to reach a local minimum, while the second regulates the repulsion between the synaptic weight vectors and the inputs' gravity center in order to favor convergence to the global minimum. The algorithm leads to optimal or near-optimal solutions and allows the network to escape from local minima during training. Experimental results in image compression demonstrate that it outperforms the simple competitive learning algorithm, yielding better codebooks.
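The abstract describes an update with two terms: attraction of the winning weight vector toward the input pattern, and repulsion of that vector away from the inputs' gravity center. A minimal sketch of one such step is given below; the exact update rule, learning rates (`alpha`, `beta` here), and winner-selection details are not stated in the abstract, so this is an illustrative assumption, not the paper's algorithm.

```python
import numpy as np

def expansive_competitive_step(W, x, c, alpha=0.1, beta=0.02):
    """One hypothetical two-term competitive learning step.

    W     : (k, d) array of synaptic weight vectors (the codebook)
    x     : (d,) input pattern
    c     : (d,) gravity center (mean) of the input patterns
    alpha : assumed attraction rate toward the input
    beta  : assumed repulsion rate away from the gravity center
    """
    # Winner-take-all: the weight vector closest to the input.
    j = int(np.argmin(np.linalg.norm(W - x, axis=1)))
    # Attraction term: pull the winner toward the input pattern.
    W[j] += alpha * (x - W[j])
    # Expansive (repulsion) term: push the winner away from the
    # gravity center, helping it leave a poor local minimum.
    W[j] += beta * (W[j] - c)
    return W
```

With `beta = 0` this reduces to simple competitive learning; the repulsion term is what the abstract credits with favoring convergence to the global minimum.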
Year
DOI
Venue
2002
10.1023/A:1015785501885
Neural Processing Letters
Keywords
Field
DocType
competitive learning,global search,image compression,neural networks,vector quantization
Convergence (routing),Competitive learning,Algorithm,Maxima and minima,Vector quantization,Artificial intelligence,Artificial neural network,Synaptic weight,Mathematics,Machine learning,Image compression,Codebook
Journal
Volume
Issue
ISSN
15
3
1573-773X
Citations 
PageRank 
References 
5
0.48
9
Authors
4