Efficient and robust approximate nearest neighbor search using Hierarchical Navigable Small World graphs

Published Mar 30, 2016 in cs.DS , cs.CV , cs.IR , and cs.SI


We present a new approach for the approximate K-nearest neighbor search based on navigable small world graphs with controllable hierarchy (Hierarchical NSW, HNSW). The proposed solution is fully graph-based, without any need for the additional search structures typically used at the coarse search stage of most proximity graph techniques. Hierarchical NSW incrementally builds a multi-layer structure consisting of a hierarchical set of proximity graphs (layers) for nested subsets of the stored elements. The maximum layer in which an element is present is selected randomly with an exponentially decaying probability distribution. This allows producing graphs similar to the previously studied Navigable Small World (NSW) structures while additionally having the links separated by their characteristic distance scales. Starting the search from the upper layer, together with utilizing the scale separation, boosts the performance compared to NSW and allows logarithmic complexity scaling. Additional employment of a heuristic for selecting proximity graph neighbors significantly increases performance at high recall and in the case of highly clustered data. Performance evaluation has demonstrated that the proposed general metric space search index is able to strongly outperform previous open-source state-of-the-art vector-only approaches. The similarity of the algorithm to the skip list structure allows a straightforward balanced distributed implementation.


  • The paper introduces the Hierarchical Navigable Small World (HNSW) method for Approximate Nearest Neighbor Search (ANNS) that scales efficiently with high-dimensional data.

  • HNSW employs a multi-layered graph structure, innovative use of heuristics for neighbor selection, and differentiates links by distance scales to improve search efficiency.

  • Performance evaluations highlight HNSW's superior capabilities in vector spaces over existing open-source state-of-the-art approaches, while its algorithmic similarity to the skip list structure simplifies balanced distributed implementation.

  • HNSW's potential for broad AI applications is discussed, especially in unsupervised and deep learning, though its distributed search capabilities present a limitation for future research.


In the landscape of information retrieval, K-Nearest Neighbor Search (K-NNS) plays a critical role across various applications, including machine learning, image recognition, and semantic document retrieval. Given a distance function, K-NNS aims to identify the K dataset elements closest to any given query. Traditional methods, although effective at smaller scales, falter when faced with high-dimensional data, as they succumb to the so-called "curse of dimensionality." This presents an ongoing challenge: developing a method that offers both scalability and efficiency when approximating nearest neighbors under such circumstances.
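To make the problem concrete, here is a minimal exact K-NNS baseline in Python. This is an illustrative sketch (the function name `knn_brute_force` is ours, not the paper's): it performs the linear scan that approximate methods such as HNSW are designed to avoid on large, high-dimensional datasets.

```python
import math

def knn_brute_force(dataset, query, k):
    """Exact K-NNS by linear scan: O(N) distance evaluations per query.

    This baseline is what approximate methods such as HNSW aim to avoid
    on large, high-dimensional datasets, where a full scan becomes
    prohibitively expensive.
    """
    return sorted(dataset, key=lambda p: math.dist(p, query))[:k]

points = [(0.0, 0.0), (1.0, 1.0), (0.2, 0.1), (5.0, 5.0)]
print(knn_brute_force(points, (0.0, 0.0), k=2))  # the two closest points
```

Approximate methods trade a small loss in recall for avoiding exactly this per-query linear cost.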

The Hierarchical Navigable Small World (HNSW) Approach

A significant advancement is presented through the Hierarchical Navigable Small World (HNSW) approach, which introduces a graph-based method for Approximate Nearest Neighbor Search (ANNS). Unlike traditional proximity graph techniques that endure performance degradation with high-dimensional or clustered data, HNSW constructs a multi-layer structure consisting of hierarchical graphs for nested dataset subsets, thereby facilitating a logarithmic complexity scaling in search operations.
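The coarse-to-fine routing idea can be sketched in a few lines of Python. This is a simplified illustration of the descent through layers, not the paper's full algorithm (which maintains a dynamic candidate list controlled by an `ef` parameter); the names `hnsw_style_search` and the toy data are our own assumptions.

```python
def hnsw_style_search(layers, dist, query, entry_point):
    """Simplified coarse-to-fine routing, top layer first.

    Each layer is an adjacency dict mapping a node id to its neighbors.
    We greedily walk each layer to a local minimum of dist(node, query),
    then reuse that node as the entry point into the denser layer below.
    (The real algorithm keeps a candidate list of size `ef` rather than
    a single node; this sketch shows only the hierarchical descent.)
    """
    current = entry_point
    for graph in layers:
        improved = True
        while improved:
            improved = False
            for neighbor in graph.get(current, ()):
                if dist(neighbor, query) < dist(current, query):
                    current, improved = neighbor, True
    return current  # approximate nearest neighbor at the bottom layer

# Toy example: four points on a line; the top layer links only 0 and 3.
pos = {0: 0.0, 1: 1.0, 2: 2.0, 3: 3.0}
dist = lambda i, q: abs(pos[i] - q)
layers = [
    {0: [3], 3: [0]},                        # sparse upper layer
    {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]},  # dense base layer
]
print(hnsw_style_search(layers, dist, 2.1, entry_point=0))  # → 2
```

Because upper layers are exponentially sparser, each layer contributes only a short greedy walk, which is the intuition behind the logarithmic scaling.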

Key innovations of HNSW include:

  • A multi-layered graph structure that assigns each element a maximum layer drawn from an exponentially decaying probability distribution.

  • The separation of links by characteristic distance scales, enhancing efficiency by initiating searches from higher layers.

  • The use of advanced heuristics for neighbor selection within proximity graphs, significantly improving performance in scenarios of high recall and clustered data.
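The first of these innovations, the layer assignment, can be sketched directly. Drawing the level as floor(-ln(U) · m_L), with U uniform on (0, 1), yields the exponentially decaying distribution; the specific normalization m_L = 1/ln(M), where M is a per-node link budget, is a common parameter choice and an assumption of this sketch rather than a quote from this summary.

```python
import math
import random

def random_level(m_l, rng=random.random):
    """Draw an element's maximum layer as floor(-ln(U) * m_L), U ~ Uniform(0, 1).

    Roughly a fraction exp(-l / m_L) of elements reach layer l or higher,
    so upper layers get exponentially sparser.
    """
    return int(-math.log(rng()) * m_l)

# Assumed parameters: with M = 16 links per node, m_L = 1 / ln(16) means
# about 1/16 of elements reach layer 1, 1/256 reach layer 2, and so on.
m_l = 1 / math.log(16)
random.seed(0)
levels = [random_level(m_l) for _ in range(1000)]
print(max(levels), levels.count(0))
```

With this distribution the vast majority of elements live only in the base layer, which is what keeps the upper routing layers small.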

Performance evaluations have solidly confirmed HNSW's superior capabilities over existing state-of-the-art approaches, particularly in handling vector spaces. The similarity of HNSW's algorithmic nature to the Skip List structure also simplifies its implementation for balanced distributed systems.

Implications and Speculations on Future AI Developments

The HNSW model showcases a leap in addressing the challenges posed by the "curse of dimensionality" in large-scale datasets. Its efficient indexing and retrieval capabilities not only make it an excellent choice for real-world applications requiring high precision and recall but also pave the way for new explorations in the field of AI, especially in unsupervised learning and deep learning frameworks where efficient data retrieval is critical.

Moreover, the ability of HNSW to perform in generalized metric spaces opens up possibilities for its application beyond vector spaces, such as in semantic search, where complex structures and relationships within data can be navigated more intuitively.
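As a small illustration of this generality: the search only ever compares distance values and never inspects coordinates, so any metric can drive it. The sketch below uses Hamming distance over strings, with a brute-force scan standing in for the graph traversal an actual index would perform (`hamming` is our own helper, not part of any HNSW API).

```python
def hamming(a, b):
    """Hamming distance between equal-length strings: the number of
    positions at which they differ. A graph-based index like HNSW only
    needs such pairwise distances, not a vector embedding."""
    assert len(a) == len(b), "Hamming distance needs equal-length strings"
    return sum(x != y for x, y in zip(a, b))

# Brute-force nearest neighbor under the metric (stands in for the
# graph traversal an actual index would perform).
words = ["karolin", "kathrin", "kerstin", "xxxxxxx"]
print(min(words, key=lambda w: hamming(w, "karstin")))  # → kerstin
```

The same substitution works for edit distance, Jaccard distance, or any other metric that is expensive but well defined on the data.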

However, it's essential to recognize the limitations, particularly in terms of distributed search capabilities. While HNSW enhances performance and scalability, it diverges from the decentralized construction potential of its NSW predecessor due to its top-layer search initiation. Future research might explore methods to reconcile these aspects, potentially enhancing the distributed performance further.


HNSW presents a robust, scalable solution to the K-ANNS problem, especially within high-dimensional datasets. It stands as a testament to the evolving landscape of information retrieval, where efficiency and accuracy are paramount. As AI and machine learning applications continue to expand into increasingly complex data spaces, the principles and performances exhibited by approaches like HNSW will undoubtedly serve as critical foundations. Future advancements may well rest on these efficient data retrieval mechanisms, driving forward the capabilities and applications of AI technologies.
