Abstract |
---|
A leaders set, derived using the leaders clustering method, can be used in place of a large training set to reduce the computational burden of a classifier. Recently, we showed that the weighted k-nearest leader-based classifier is a fast and efficient leader-based classifier. However, some uncertainty exists in calculating the relative importance (weight) of the prototypes. This paper proposes a generalization of the earlier k-nearest leader-based classifier in which a novel soft computing approach is used to resolve this uncertainty. Combined principles of rough set theory and fuzzy set theory are used to analyze the proposed method. The proposed method, called the rough-fuzzy weighted k-nearest leader classifier (RF-wk-NLC), uses a two-level hierarchy of prototypes along with their relative importance. Experiments on some standard data sets show that RF-wk-NLC improves performance over the earlier related methods. |
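The leader-based classification summarized in the abstract can be sketched as follows. This is a generic illustration only, assuming Euclidean distance, a single-scan distance-threshold leaders clustering, and follower counts as leader weights; the paper's actual rough-fuzzy weighting scheme and two-level prototype hierarchy are more elaborate than this sketch.

```python
import numpy as np

def leaders_clustering(X, y, tau):
    """Single-scan leaders clustering: a point farther than tau from every
    existing leader becomes a new leader; otherwise it is counted as a
    follower of its nearest leader."""
    leaders, labels, counts = [], [], []
    for x, c in zip(X, y):
        if leaders:
            d = np.linalg.norm(np.array(leaders) - x, axis=1)
            j = int(np.argmin(d))
            if d[j] <= tau:
                counts[j] += 1       # x becomes a follower of leader j
                continue
        leaders.append(x)            # x starts a new leader
        labels.append(c)
        counts.append(1)
    return np.array(leaders), np.array(labels), np.array(counts)

def wk_nlc_predict(x, leaders, labels, weights, k=3):
    """Weighted k-nearest-leader vote: the k leaders closest to x vote with
    their weights; the class with the largest total weight wins."""
    d = np.linalg.norm(leaders - x, axis=1)
    votes = {}
    for j in np.argsort(d)[:k]:
        votes[labels[j]] = votes.get(labels[j], 0.0) + weights[j]
    return max(votes, key=votes.get)

# Toy two-class data: one compact cluster per class.
X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0],
              [5.0, 5.1], [0.2, 0.1], [4.9, 5.0]])
y = np.array([0, 0, 1, 1, 0, 1])
leaders, labels, counts = leaders_clustering(X, y, tau=1.0)
print(wk_nlc_predict(np.array([0.5, 0.5]), leaders, labels, counts, k=1))  # -> 0
```

With `tau=1.0` the scan condenses the six training points to two leaders (one per class, each with three followers), so classification needs far fewer distance computations than k-NN over the full training set, which is the speed-up the leader-based family targets.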
Year | DOI | Venue |
---|---|---|
2009 | 10.1016/j.patcog.2008.11.021 | Pattern Recognition |
Keywords | Field | DocType
---|---|---|
leaders–subleaders,leader-based classifier,k-nearest leader-based classifier,rough-fuzzy weighted k-nearest leader classifier (RF-wk-NLC),k-NNC,leaders set,relative importance,rough-fuzzy sets,rough set theory,fuzzy set theory,soft computing,Bayes classifier,large data sets | Data mining,Margin (machine learning),Fuzzy set,Artificial intelligence,Soft computing,Classifier (linguistics),Pattern recognition,Fuzzy logic,Rough set,Margin classifier,Machine learning,Mathematics,Quadratic classifier | Journal
Volume | Issue | ISSN
---|---|---|
42 | 9 | 0031-3203
Citations | PageRank | References
---|---|---|
9 | 0.57 | 27
Authors |
---|
2 |
Name | Order | Citations | PageRank |
---|---|---|---|
V. Suresh Babu | 1 | 38 | 4.00 |
P. Viswanath | 2 | 148 | 11.77 |