Title
Gaussian Mixture Reduction Using Reverse Kullback-Leibler Divergence
Abstract
We propose a greedy mixture reduction algorithm which is capable of pruning mixture components as well as merging them based on the Kullback-Leibler divergence (KLD). The algorithm is distinct from the well-known Runnalls' KLD-based method since it is not restricted to merging operations. The capability of pruning (in addition to merging) enables the algorithm to preserve the peaks of the original mixture during the reduction. Analytical approximations are derived to circumvent the computational intractability of the KLD, resulting in a computationally efficient method. The proposed algorithm is compared with Runnalls' and Williams' methods in two numerical examples, using both simulated and real-world data. The results indicate that the performance and computational complexity of the proposed approach make it an efficient alternative to existing mixture reduction methods.
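The abstract contrasts the proposed method with Runnalls' KLD-based reduction. As background, the greedy merging baseline can be sketched as below: repeatedly merge the pair of Gaussian components with the smallest Runnalls KLD upper bound until the target size is reached. This is a minimal sketch of the Runnalls baseline only; the paper's pruning criterion and reverse-KLD approximations are not reproduced here, and all function names are illustrative.

```python
import numpy as np

def merge(w1, m1, P1, w2, m2, P2):
    # Moment-preserving merge of two weighted Gaussian components.
    w = w1 + w2
    m = (w1 * m1 + w2 * m2) / w
    d1, d2 = m1 - m, m2 - m
    P = (w1 * (P1 + np.outer(d1, d1)) + w2 * (P2 + np.outer(d2, d2))) / w
    return w, m, P

def runnalls_cost(w1, m1, P1, w2, m2, P2):
    # Runnalls' upper bound on the KLD increase incurred by merging
    # components i and j: 0.5 * (w*log|P| - w1*log|P1| - w2*log|P2|).
    w, _, P = merge(w1, m1, P1, w2, m2, P2)
    _, ld = np.linalg.slogdet(P)
    _, ld1 = np.linalg.slogdet(P1)
    _, ld2 = np.linalg.slogdet(P2)
    return 0.5 * (w * ld - w1 * ld1 - w2 * ld2)

def reduce_mixture(weights, means, covs, target):
    # Greedily merge the cheapest pair until `target` components remain.
    comps = list(zip(weights, means, covs))
    while len(comps) > target:
        best = None
        for i in range(len(comps)):
            for j in range(i + 1, len(comps)):
                c = runnalls_cost(*comps[i], *comps[j])
                if best is None or c < best[0]:
                    best = (c, i, j)
        _, i, j = best
        merged = merge(*comps[i], *comps[j])
        comps = [c for k, c in enumerate(comps) if k not in (i, j)] + [merged]
    return comps
```

The merge step preserves the first two moments of the removed pair, so the reduced mixture keeps the overall mean and covariance of what it replaces; the proposed method differs precisely in also allowing components to be pruned outright.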
Year
2015
Venue
CoRR
Field
Mathematical optimization, Bayesian inference, Fiducial inference, Bayesian linear regression, Approximate inference, Statistical inference, Artificial intelligence, Bayesian statistics, Prior probability, Conjugate prior, Machine learning, Mathematics
DocType
Journal
Volume
abs/1508.05514
Citations
0
PageRank
0.34
References
4
Authors
3
Name, Order, Citations, PageRank
Tohid Ardeshiri, 1, 27, 7.14
Umut Orguner, 2, 548, 40.11
Emre Özkan, 3, 94, 10.54