Abstract |
---|
k Nearest Neighbors (k-NN) search is a widely used category of algorithms with applications in domains such as computer vision and machine learning. Despite the desire to process increasing amounts of high-dimensional data within these domains, k-NN algorithms scale poorly on multicore systems because they hit a memory wall. In this paper, we propose a novel data filtering strategy for k-NN search algorithms on multicore platforms. By excluding unlikely features during the k-NN search process, this strategy can reduce the amount of computation required as well as the memory footprint. It is complementary to the data selection strategies used in other state-of-the-art k-NN algorithms. A Subspace Clustering for Filtering (SCF) method is proposed to implement the data filtering strategy. Experimental results on four k-NN algorithms show that SCF can significantly improve their performance on three modern multicore platforms with only a small loss of search precision. |
Year | DOI | Venue
---|---|---
2015 | 10.1109/TPDS.2014.2372755 | IEEE Transactions on Parallel and Distributed Systems
Keywords | Field | DocType
---|---|---
high-dimensional space, memory wall, multicore systems, scalability, subspace clustering for filtering, k nearest neighbors, approximation algorithms, estimation, multicore processing, clustering algorithms, filtering | k-nearest neighbors algorithm, Approximation algorithm, Search algorithm, Computer science, Filter (signal processing), Theoretical computer science, Memory footprint, Cluster analysis, Multi-core processor, Scalability | Journal
Volume | Issue | ISSN
---|---|---
PP | 99 | 1045-9219
Citations | PageRank | References
---|---|---
3 | 0.39 | 24
Authors |
---|
5 |
Name | Order | Citations | PageRank
---|---|---|---
Xiaoxin Tang | 1 | 9 | 2.18 |
Zhiyi Huang | 2 | 91 | 19.14 |
David M. Eyers | 3 | 477 | 45.90 |
Steven Mills | 4 | 41 | 17.74 |
Minyi Guo | 5 | 3969 | 332.25 |