K-Nearest Neighbor Approximation Via the Friend-of-a-Friend Principle (1908.07645v4)

Published 20 Aug 2019 in math.CO, math.ST, and stat.TH

Abstract: Suppose $V$ is an $n$-element set where for each $x \in V$, the elements of $V \setminus \{x\}$ are ranked by their similarity to $x$. The $K$-nearest neighbor graph is a directed graph including an arc from each $x$ to the $K$ points of $V \setminus \{x\}$ most similar to $x$. Constructive approximation to this graph using far fewer than $n^2$ comparisons is important for the analysis of large high-dimensional data sets. $K$-Nearest Neighbor Descent is a parameter-free heuristic where a sequence of graph approximations is constructed, in which second neighbors in one approximation are proposed as neighbors in the next. Run times in a test case fit an $O(n K^2 \log n)$ pattern. This bound is rigorously justified for a similar algorithm, using range queries, when applied to a homogeneous Poisson process in suitable dimension. However, the basic algorithm fails to achieve subquadratic complexity on sets whose similarity rankings arise from a "generic" linear order on the $\binom{n}{2}$ inter-point distances in a metric space.
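
For a concrete picture of the friend-of-a-friend iteration the abstract describes, here is a minimal Python sketch. It is an illustration under stated assumptions, not the paper's implementation: it assumes Euclidean distance as the similarity measure, proposes only forward second neighbors (published NN-Descent variants also use reverse neighbors and candidate sampling), and all names such as `knn_descent` and `max_iters` are hypothetical.

```python
import math
import random

def knn_descent(points, K, max_iters=20):
    """Approximate the K-nearest-neighbor graph of `points` (a list of tuples).

    Start from a random graph; in each round, propose every point's
    neighbors-of-neighbors ("friends of friends") as candidates and keep
    the K closest seen so far. Stop when a round yields no improvement.
    """
    n = len(points)
    # Each entry is a list of (distance, index) pairs, sorted by distance.
    neighbors = []
    for i in range(n):
        sample = random.sample([j for j in range(n) if j != i], K)
        neighbors.append(sorted((math.dist(points[i], points[j]), j) for j in sample))

    for _ in range(max_iters):
        improved = False
        for i in range(n):
            # Candidate set: second neighbors, i.e. neighbors of i's neighbors.
            candidates = set()
            for _, j in neighbors[i]:
                candidates.update(k for _, k in neighbors[j])
            current = {j for _, j in neighbors[i]}
            for c in candidates - current - {i}:
                d = math.dist(points[i], points[c])
                if d < neighbors[i][-1][0]:
                    # The candidate beats the current worst neighbor; swap it in.
                    neighbors[i][-1] = (d, c)
                    neighbors[i].sort()
                    improved = True
        if not improved:
            break
    return [[j for _, j in lst] for lst in neighbors]

if __name__ == "__main__":
    random.seed(0)
    pts = [(random.random(), random.random()) for _ in range(200)]
    graph = knn_descent(pts, K=5)
    print("Approximate 5-NN of point 0:", graph[0])
```

In this sketch each round inspects only neighbors-of-neighbors, so the work per round is roughly $n K^2$ distance evaluations; if the number of rounds grows logarithmically in $n$, that matches the $O(n K^2 \log n)$ pattern the abstract reports for its test case.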

Citations (10)
