Deep Unsupervised Hashing by Distilled Smooth Guidance (2105.06125v1)

Published 13 May 2021 in cs.CV

Abstract: Hashing has been widely used in approximate nearest neighbor search for its storage and computational efficiency. Deep supervised hashing methods are not widely used because of the lack of labeled data, especially under domain transfer. Meanwhile, unsupervised deep hashing models can hardly achieve satisfactory performance due to the lack of reliable similarity signals. To tackle this problem, we propose a novel deep unsupervised hashing method, namely Distilled Smooth Guidance (DSG), which learns a distilled dataset consisting of similarity signals as well as smooth confidence signals. Specifically, we obtain similarity confidence weights based on the initial noisy similarity signals learned from local structures and construct a priority loss function for smooth similarity-preserving learning. In addition, global information based on clustering is used to distill the image pairs by removing contradictory similarity signals. Extensive experiments on three widely used benchmark datasets show that the proposed DSG consistently outperforms state-of-the-art search methods.
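
The abstract describes three ingredients: pseudo-similarity signals mined from local neighborhood structure, confidence weights that smooth the resulting noisy supervision, and a clustering-based global check that removes contradictory pairs before training hash codes. The sketch below (PyTorch + scikit-learn) illustrates one way such a pipeline could be assembled; the mutual-kNN rule, the confidence weighting, the k-means granularity, and the weighted pairwise loss are all illustrative assumptions, not the paper's actual formulation.

```python
# Illustrative sketch (not the authors' code) of the ideas in the abstract:
# local-structure pseudo-similarity, smooth confidence weights, and a global
# clustering step that distills away contradictory pairs. All names and
# hyperparameters here are assumptions for illustration only.

import torch
import torch.nn.functional as F
from sklearn.cluster import KMeans


def build_distilled_pairs(features: torch.Tensor, k: int = 10, n_clusters: int = 50):
    """Return pseudo-similarity labels, confidence weights, and a keep mask.

    features: (N, D) deep features from a pretrained backbone (assumed input).
    Output tensors all have shape (N, N).
    """
    feats = F.normalize(features, dim=1)
    cosine = feats @ feats.t()                       # pairwise cosine similarity

    # Local structure: mutual k-nearest neighbors are treated as similar pairs.
    knn = cosine.topk(k + 1, dim=1).indices[:, 1:]   # drop self-match
    nn_mask = torch.zeros_like(cosine)
    nn_mask.scatter_(1, knn, 1.0)
    mutual = (nn_mask * nn_mask.t()) > 0
    sim = mutual.float() * 2 - 1                     # +1 similar, -1 dissimilar

    # Smooth confidence: pairs far from the ambiguous middle get higher weight.
    conf = (cosine - cosine.mean()).abs()
    conf = conf / conf.max().clamp(min=1e-8)

    # Global structure: k-means cluster assignments on the same features.
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(feats.cpu().numpy())
    labels = torch.as_tensor(labels, device=feats.device)
    same_cluster = labels.unsqueeze(0) == labels.unsqueeze(1)

    # Distillation: drop pairs whose local signal contradicts the global one.
    keep = ~((sim > 0) & ~same_cluster) & ~((sim < 0) & same_cluster)
    return sim, conf, keep.float()


def weighted_pairwise_loss(codes: torch.Tensor, sim, conf, keep):
    """Confidence-weighted similarity-preserving loss on relaxed hash codes.

    codes: (N, B) network outputs in [-1, 1] (e.g. tanh activations).
    """
    bits = codes.size(1)
    inner = codes @ codes.t() / bits                 # scaled to [-1, 1], matched to sim
    per_pair = (inner - sim).pow(2)
    return (conf * keep * per_pair).sum() / keep.sum().clamp(min=1.0)
```

In this reading, the "distilled dataset" is the set of pairs surviving the keep mask, and the "smooth guidance" is the per-pair confidence that scales the loss instead of a hard 0/1 decision; the paper's actual priority loss and confidence estimation may differ.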

Authors (7)
  1. Xiao Luo (112 papers)
  2. Zeyu Ma (20 papers)
  3. Daqing Wu (5 papers)
  4. Huasong Zhong (9 papers)
  5. Chong Chen (122 papers)
  6. Jinwen Ma (31 papers)
  7. Minghua Deng (15 papers)
Citations (2)
