Approximate Nearest Neighbor Searching with Non-Euclidean and Weighted Distances

Abstract: We present a new approach to ε-approximate nearest-neighbor queries in fixed dimension under a variety of non-Euclidean distances. We consider two families of distance functions: (a) convex scaling distance functions, including the Mahalanobis distance, the Minkowski metric, and multiplicative weights, and (b) Bregman divergences, including the Kullback-Leibler divergence and the Itakura-Saito distance. As the fastest known data structures rely on the lifting transformation, their application is limited to the Euclidean metric, and alternative approaches for other distance functions are much less efficient. We circumvent the reliance on the lifting transformation by a careful application of convexification, which appears to be relatively new to computational geometry. We are given n points in ℝ^d, each a site possibly defining its own distance function. Under mild assumptions on the growth rates of these functions, the proposed data structures answer queries in logarithmic time using O(n log(1/ε) / ε^{d/2}) space, which nearly matches the best known results for the Euclidean metric.
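To illustrate the second family of distance functions, here is a minimal sketch (not the paper's data structure) of how a Bregman divergence D_F(p, q) = F(p) − F(q) − ⟨∇F(q), p − q⟩ is computed from a convex generator F, and how the Kullback-Leibler divergence and the Itakura-Saito distance arise from the generators F(x) = Σ xᵢ log xᵢ and F(x) = −Σ log xᵢ, respectively. The function and variable names are illustrative, not from the paper.

```python
import math

def bregman(F, gradF, p, q):
    """Bregman divergence D_F(p, q) = F(p) - F(q) - <grad F(q), p - q>."""
    return F(p) - F(q) - sum(g * (pi - qi) for g, pi, qi in zip(gradF(q), p, q))

# Generator of the (generalized) Kullback-Leibler divergence: F(x) = sum x_i log x_i.
F_kl = lambda x: sum(xi * math.log(xi) for xi in x)
grad_kl = lambda x: [math.log(xi) + 1 for xi in x]

# Generator of the Itakura-Saito distance: F(x) = -sum log x_i.
F_is = lambda x: -sum(math.log(xi) for xi in x)
grad_is = lambda x: [-1.0 / xi for xi in x]

p, q = [0.2, 0.8], [0.5, 0.5]
d_kl = bregman(F_kl, grad_kl, p, q)  # equals sum p_i log(p_i / q_i) for distributions
d_is = bregman(F_is, grad_is, p, q)  # equals sum (p_i/q_i - log(p_i/q_i) - 1)
```

Note that a Bregman divergence is generally asymmetric and violates the triangle inequality, which is one reason the Euclidean lifting transformation does not directly apply to this setting.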
Document type: Conference papers
Contributor: Guilherme D. da Fonseca
Submitted on: Tuesday, January 15, 2019 - 3:55:03 AM
Last modification on: Monday, January 20, 2020 - 12:14:05 PM

Ahmed Abdelkader, Sunil Arya, Guilherme da Fonseca, David Mount. Approximate Nearest Neighbor Searching with Non-Euclidean and Weighted Distances. SODA 2019 - Proceedings of the Thirtieth Annual ACM-SIAM Symposium on Discrete Algorithms, Jan 2019, San Diego, United States. pp.355-372, ⟨10.1137/1.9781611975482.23⟩. ⟨hal-01981332⟩