Abstract |
---|
Hubness is a recently described aspect of the curse of dimensionality inherent to nearest-neighbor methods. In this paper we present a new approach for exploiting the hubness phenomenon in k-nearest neighbor classification. We argue that some neighbor occurrences carry more information than others, by virtue of being less frequent events. This observation is related to the hubness phenomenon, and we explore how it affects high-dimensional k-nearest neighbor classification. We propose a new algorithm, Hubness Information k-Nearest Neighbor (HIKNN), which introduces k-occurrence informativeness into the hubness-aware k-nearest neighbor voting framework. Our evaluation on high-dimensional data shows significant improvements over both the basic k-nearest neighbor approach and all previously used hubness-aware approaches. |
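The abstract's core idea — that rare neighbor occurrences carry more information, and votes should be weighted accordingly — can be sketched in a few lines. The following is a minimal illustrative surrogate, not the exact HIKNN weighting from the paper: it counts each training point's k-occurrences (how often it appears among other points' k nearest neighbors) and uses a simple log-rarity score, `log(n / (N_k(x) + 1))`, as the vote weight. The function names and the particular informativeness formula are assumptions for illustration only.

```python
import numpy as np

def knn_indices(X, k):
    # Brute-force k-nearest neighbors by Euclidean distance, excluding self.
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    return np.argsort(d, axis=1)[:, :k]

def occurrence_informativeness(nn, n):
    # N_k(x): how many times each point occurs among other points' k-NN lists.
    counts = np.bincount(nn.ravel(), minlength=n)
    # Rarer occurrences are scored as more informative (illustrative surrogate
    # for the paper's occurrence-informativeness measure; hubs get low weight).
    return np.log(n / (counts + 1.0))

def informativeness_weighted_vote(X, y, query, k=5):
    # Weight each neighbor's vote by its occurrence informativeness.
    n = len(X)
    info = occurrence_informativeness(knn_indices(X, k), n)
    idx = np.argsort(np.linalg.norm(X - query, axis=1))[:k]
    votes = {}
    for i in idx:
        votes[y[i]] = votes.get(y[i], 0.0) + info[i]
    return max(votes, key=votes.get)
```

Under this scheme a strong hub (very large N_k) contributes almost nothing to the vote, while a point that rarely occurs as a neighbor dominates — the intuition the abstract describes, though the paper's actual formulation differs in detail.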
Year | DOI | Venue |
---|---|---|
2012 | 10.1109/ICDMW.2011.127 | Comput. Sci. Inf. Syst. |
Keywords | DocType | Volume
---|---|---|
hubness,high-dimensional data,nearest neighbor voting,past occurrences,hubness-aware approach,k-nearest neighbor classification,new algorithm,hubness-aware k-nearest neighbor,neighbor occurrence,high-dimensional k-nearest neighbor classification,hubness phenomenon,basic k-nearest neighbor approach,high dimensional data,curse of dimensionality,k nearest neighbor,nearest neighbor method,classification,nearest neighbor,knn,data mining | Journal | 9
Issue | Citations | PageRank
---|---|---|
2 | 21 | 0.70
References | Authors |
---|---|
0 | 2 |
Name | Order | Citations | PageRank |
---|---|---|---|
Nenad Tomasev | 1 | 98 | 7.60 |
Dunja Mladenic | 2 | 1484 | 170.14 |