Title
Hybrid sampling on mutual information entropy-based clustering ensembles for optimizations
Abstract
In this paper, we focus on the design of bivariate EDAs for discrete optimization problems and propose a new approach named HSMIEC. Because current EDAs spend considerable time in the statistical learning process when the relationships among the variables are complicated, we employ the Selfish Gene theory (SG) in this approach, and a Mutual Information and Entropy based Clustering (MIEC) model is built to optimize the probability distribution of the virtual population. This model uses a hybrid sampling method that considers both clustering accuracy and clustering diversity, and an incremental learning and resampling scheme is used to optimize the parameters of the correlations of the variables. Experimental results on several benchmark problems demonstrate that HSMIEC often performs better than other EDAs, such as BMDA, COMIT, MIMIC and ECGA.
Year
2010
DOI
10.1016/j.neucom.2009.11.011
Venue
Neurocomputing
Keywords
bivariate EDAs, mutual information, current EDAs, new approach, selfish gene theory, clustering accuracy, statistical learning process, incremental learning, estimation of distribution algorithm, clustering ensembles, benchmark problem, clustering diversity, mutual information entropy, hybrid sampling, probability distribution, sampling methods, discrete optimization
Field
EDAs, Population, Pattern recognition, Estimation of distribution algorithm, Probability distribution, Artificial intelligence, Mutual information, Sampling (statistics), Cluster analysis, Bivariate analysis, Mathematics, Machine learning
DocType
Journal
Volume
73
Issue
7-9
Citations
3
PageRank
0.43
References
19
Authors
5
Name           Order  Citations  PageRank
Feng Wang      1      195        19.03
Cheng Yang     2      631        62.94
Zhiyi Lin      3      28         2.10
Yuanxiang Li   4      245        51.20
Yuan Yuan      5      4095       175.76