Abstract |
---|
K Nearest Neighbors (k-NN) search is a widely used category of algorithms with applications in domains such as computer vision and machine learning. With the rapidly increasing volume and dimensionality of available data, k-NN algorithms scale poorly on multicore systems because they hit the memory wall. In this paper, we propose a novel data filtering strategy, named Subspace Clustering for Filtering (SCF), for k-NN search algorithms on multicore platforms. By excluding unlikely features from the k-NN search, this strategy reduces both memory footprint and computation. Experimental results on four k-NN algorithms show that SCF improves their performance on two modern multicore platforms with insignificant loss of search precision. |
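The abstract describes pruning the search space before exact distance computation. The paper's actual SCF algorithm is not detailed in this record, so the sketch below only illustrates the general idea with a generic cluster-based candidate filter: exact k-NN is run only over points whose cluster centroid is close to the query. All function names (`knn_brute_force`, `knn_filtered`) and the `n_probe` parameter are hypothetical illustrations, not the paper's API.

```python
# Hypothetical sketch of candidate filtering for k-NN search.
# NOT the paper's SCF method; a generic cluster-pruning illustration.
import math

def dist(a, b):
    # Euclidean distance between two points (tuples of floats).
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def knn_brute_force(query, data, k):
    # Exact k-NN: sort all indices by distance to the query.
    return sorted(range(len(data)), key=lambda i: dist(query, data[i]))[:k]

def knn_filtered(query, data, k, centroids, assign, n_probe=2):
    # Keep only points whose cluster centroid is among the n_probe
    # centroids closest to the query, then run exact k-NN on that subset.
    # assign[i] is the cluster id of data[i]; both are assumed precomputed.
    order = sorted(range(len(centroids)), key=lambda c: dist(query, centroids[c]))
    keep = set(order[:n_probe])
    candidates = [i for i in range(len(data)) if assign[i] in keep]
    return sorted(candidates, key=lambda i: dist(query, data[i]))[:k]
```

Filtering trades a small loss of precision (a true neighbor may sit in a pruned cluster) for fewer distance computations and a smaller working set, which is the kind of memory-footprint reduction the abstract claims for SCF.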
Year | DOI | Venue |
---|---|---|
2014 | 10.1145/2600212.2600710 | HPDC |
Keywords | Field | DocType
---|---|---|
k nearest neighbors, multicore systems, general, subspace clustering for filtering, high-dimensional space, memory wall | k-nearest neighbors algorithm, Search algorithm, Computer science, Parallel computing, Filter (signal processing), Curse of dimensionality, Memory footprint, Multi-core processor, Computation, Scalability | Conference
Citations | PageRank | References
---|---|---|
5 | 0.43 | 27
Authors |
---|
6 |
Name | Order | Citations | PageRank |
---|---|---|---|
Xiaoxin Tang | 1 | 10 | 2.60 |
Steven Mills | 2 | 41 | 17.74 |
David M. Eyers | 3 | 477 | 45.90 |
Kai-Cheung Leung | 4 | 14 | 2.74 |
Zhiyi Huang | 5 | 91 | 19.14 |
Minyi Guo | 6 | 3969 | 332.25 |