Title
Vector Quantization by Minimizing Kullback-Leibler Divergence.
Abstract
This paper proposes a new method for vector quantization that minimizes the Kullback-Leibler divergence between the class label distributions over the quantization inputs, which are the original vectors, and the quantization outputs, which are the subsets of the vector set. In this way, the vector quantization output retains as much class label information as possible. An objective function is constructed, and an iterative algorithm is developed to minimize it. The new method is evaluated on the bag-of-features based image classification problem.
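The abstract describes alternating between forming quantization cells and keeping each cell's class label distribution close, in KL divergence, to the label distributions of the vectors assigned to it. The sketch below is only an illustrative reading of that idea, not the paper's actual objective or algorithm: the function names (kl_vector_quantization, label_distribution), the one-hot per-point label distributions, and the combined distortion-plus-KL reassignment rule are all assumptions introduced here for illustration.

import numpy as np

def kl(p, q, eps=1e-12):
    """KL divergence between two discrete distributions (with clipping for zeros)."""
    p = np.clip(p, eps, None)
    q = np.clip(q, eps, None)
    return float(np.sum(p * np.log(p / q)))

def label_distribution(labels, n_classes):
    """Empirical class label distribution of a set of integer labels."""
    counts = np.bincount(labels, minlength=n_classes).astype(float)
    return counts / max(counts.sum(), 1.0)

def kl_vector_quantization(X, y, K, n_classes, n_iter=20, seed=0):
    """Illustrative alternating scheme (not the paper's derivation): assign each
    vector to one of K cells so that geometric distortion plus the KL divergence
    between the vector's label distribution and the cell's label distribution
    is small, then update cell centers and cell label distributions."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    assign = rng.integers(0, K, size=n)          # random initial assignment
    P = np.eye(n_classes)[y]                     # per-point label distribution (one-hot here)
    for _ in range(n_iter):
        # update cell centers; re-seed empty cells with a random vector
        centers = np.stack([X[assign == k].mean(axis=0) if np.any(assign == k)
                            else X[rng.integers(0, n)] for k in range(K)])
        # update per-cell class label distributions; uniform for empty cells
        Q = np.stack([label_distribution(y[assign == k], n_classes)
                      if np.any(assign == k)
                      else np.full(n_classes, 1.0 / n_classes)
                      for k in range(K)])
        # reassign each vector to the cell minimizing distortion + KL penalty
        dist = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        kl_pen = np.array([[kl(P[i], Q[k]) for k in range(K)] for i in range(n)])
        assign = np.argmin(dist + kl_pen, axis=1)
    return assign, centers, Q

The trade-off between the squared-distance term and the KL penalty is a design choice made here for the sketch; the paper's objective, as stated in the abstract, is the KL divergence between input and output label distributions itself.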
Year
2015
Venue
CoRR
Field
Divergence, Linde–Buzo–Gray algorithm, Pattern recognition, Iterative method, Computer science, Learning vector quantization, Vector quantization, Artificial intelligence, Quantization (image processing), Quantization (signal processing), Kullback–Leibler divergence, Machine learning
DocType
Journal
Volume
abs/1501.07681
Citations
0
PageRank
0.34
References
22
Authors
5
Name                  Order  Citations  PageRank
Lan Yang              1      3          5.54
Jingbin Wang          2      1          2.03
Yujin Tu              3      1          0.68
Prarthana Mahapatra   4      0          0.34
Nelson Cardoso        5      0          0.34