Title
Fast Nearest Neighbor Retrieval for Bregman Divergences
Abstract
We present a data structure enabling efficient nearest neighbor (NN) retrieval for Bregman divergences. The family of Bregman divergences includes many popular dissimilarity measures, including KL-divergence (relative entropy), Mahalanobis distance, and Itakura-Saito divergence. These divergences present a challenge for efficient NN retrieval because they are not, in general, metrics, for which most NN data structures are designed. The data structure introduced in this work shares the same basic structure as the popular metric ball tree, but employs convexity properties of Bregman divergences in place of the triangle inequality. Experiments demonstrate speedups over brute-force search of up to several orders of magnitude.
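For context on the family of divergences named in the abstract, the following minimal Python sketch (not code from the paper; names such as bregman_divergence are illustrative) writes down the general Bregman divergence d_phi(x, y) = phi(x) - phi(y) - <grad phi(y), x - y> and instantiates it for two members mentioned above: squared Euclidean distance and KL-divergence.

```python
import numpy as np

def bregman_divergence(phi, grad_phi, x, y):
    """Bregman divergence induced by a strictly convex function phi:
    d_phi(x, y) = phi(x) - phi(y) - <grad phi(y), x - y>."""
    return phi(x) - phi(y) - np.dot(grad_phi(y), x - y)

# Squared Euclidean distance: phi(x) = ||x||^2  =>  d_phi(x, y) = ||x - y||^2
sq_norm = lambda x: np.dot(x, x)
sq_norm_grad = lambda x: 2.0 * x

# KL-divergence (relative entropy): phi(x) = sum_i x_i log x_i
neg_entropy = lambda x: np.sum(x * np.log(x))
neg_entropy_grad = lambda x: np.log(x) + 1.0

if __name__ == "__main__":
    x = np.array([0.2, 0.3, 0.5])
    y = np.array([0.1, 0.4, 0.5])
    # Equals ||x - y||^2
    print(bregman_divergence(sq_norm, sq_norm_grad, x, y))
    # Equals KL(x || y) when x and y are probability vectors
    print(bregman_divergence(neg_entropy, neg_entropy_grad, x, y))
```

Note that d_phi is generally asymmetric and violates the triangle inequality; per the abstract, the proposed ball-tree variant therefore relies on convexity of phi rather than metric properties to prune the search.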
Year
2008
DOI
10.1145/1390156.1390171
Venue
ICML
Keywords
basic structure, popular dissimilarity, efficient NN retrieval, popular metric ball tree, Mahalanobis distance, Itakura-Saito divergence, Bregman divergence, efficient nearest neighbor, NN data structure, data structure, nearest neighbor retrieval, nearest neighbor, relative entropy
Field
k-nearest neighbors algorithm, Data structure, Pattern recognition, Ball tree, Computer science, Mahalanobis distance, Artificial intelligence, Bregman divergence, Triangle inequality, Nearest neighbor search, Kullback–Leibler divergence, Machine learning
DocType
Conference
Citations
36
PageRank
1.38
References
17
Authors
1
Name
Lawrence Cayton
Order
1
Citations
152
PageRank
8.27