Feature Learning based Deep Supervised Hashing with Pairwise Labels (1511.03855v2)

Published 12 Nov 2015 in cs.LG and cs.CV

Abstract: Recent years have witnessed wide application of hashing for large-scale image retrieval. However, most existing hashing methods are based on hand-crafted features which might not be optimally compatible with the hashing procedure. Recently, deep hashing methods have been proposed to perform simultaneous feature learning and hash-code learning with deep neural networks, which have shown better performance than traditional hashing methods with hand-crafted features. Most of these deep hashing methods are supervised whose supervised information is given with triplet labels. For another common application scenario with pairwise labels, there have not existed methods for simultaneous feature learning and hash-code learning. In this paper, we propose a novel deep hashing method, called deep pairwise-supervised hashing (DPSH), to perform simultaneous feature learning and hash-code learning for applications with pairwise labels. Experiments on real datasets show that our DPSH method can outperform other methods to achieve the state-of-the-art performance in image retrieval applications.

Deep Pairwise-Supervised Hashing (DPSH) for Image Retrieval

The paper "Feature Learning based Deep Supervised Hashing with Pairwise Labels" introduces Deep Pairwise-Supervised Hashing (DPSH), a method aimed at enhancing image retrieval through supervised hashing with pairwise labels. DPSH is an end-to-end learning framework that surpasses traditional hashing techniques, notably in the pairwise-label setting, where methods for simultaneous feature and hash-code learning had previously been lacking.

Core Contributions

DPSH provides a robust framework comprising three integrated components: a deep neural network for feature learning, a hash function for mapping these features to hash codes, and a loss function optimizing the hash codes via pairwise label guidance. This structure facilitates mutual feedback among components, thus enhancing the quality of learned representations.

  1. End-to-End Learning: DPSH leverages a deep architecture that allows simultaneous learning of features and hash codes, contrasting with existing methods like CNNH that segregate these processes into distinct stages.
  2. Superior Performance: Empirical evaluations on datasets such as CIFAR-10 and NUS-WIDE illustrate that DPSH achieves state-of-the-art results, markedly outperforming both traditional hand-crafted feature approaches and other deep learning counterparts.
  3. Pairwise Label Focus: Unlike many supervised methods depending on triplet labels, DPSH specifically addresses applications with pairwise labels, filling a notable gap in existing literature.

Detailed Insights

Feature Learning Component:

DPSH employs a CNN architecture for feature extraction, instantiated similarly to the CNN-F model but capable of adaptation to other CNN configurations. The integration with hash function learning is streamlined within a single deep network, allowing each stage to inform and refine the others.
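Concretely, the final layer of such a network maps each CNN feature vector to a real-valued output whose sign gives the binary code. The NumPy sketch below illustrates this mapping with hypothetical dimensions (4096-d features, as in CNN-F's fc7 layer, and 12-bit codes); the random weights and features are stand-ins for illustration, not the paper's trained parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 4096-d CNN features, 12-bit codes, 5 images.
feat_dim, code_len, n_images = 4096, 12, 5

# The final fully connected layer projects features to code_len real values.
W = rng.standard_normal((feat_dim, code_len)) * 0.01  # projection weights
v = np.zeros(code_len)                                # bias

features = rng.standard_normal((n_images, feat_dim))  # stand-in for CNN outputs

u = features @ W + v  # real-valued network outputs u_i
b = np.sign(u)        # binary codes b_i = sgn(u_i), entries in {-1, +1}
```

Because the projection layer sits inside the same network as the feature extractor, gradients from the hash-code objective flow back into the convolutional layers, which is what makes the learning end-to-end.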

Learning and Optimization:

The paper introduces a discrete optimization strategy to address the hashing problem, significantly diverging from the continuous relaxations typically seen in previous works like LFH. An alternating minimization approach facilitates efficient parameter updates through backpropagation, ensuring robust convergence properties.
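In the paper's formulation, each network output u_i is tied to its binary code b_i = sgn(u_i), and the objective combines the negative log-likelihood of the observed pairwise labels, with theta_ij = (1/2) u_i^T u_j, and a quantization penalty weighted by a hyperparameter eta. A minimal NumPy sketch of that objective (summing over all pairs for simplicity, rather than only the labeled pairs):

```python
import numpy as np

def dpsh_loss(U, S, eta=10.0):
    """Pairwise loss in the spirit of DPSH.

    U   : (n, c) real-valued network outputs u_i
    S   : (n, n) pairwise labels, s_ij = 1 if similar else 0
    eta : weight of the quantization penalty
    """
    B = np.sign(U)           # binary codes b_i = sgn(u_i)
    Theta = 0.5 * (U @ U.T)  # theta_ij = (1/2) u_i^T u_j
    # Negative log-likelihood of the pairwise labels; log(1 + e^x) is
    # computed as logaddexp(0, x) for numerical stability.
    nll = -np.sum(S * Theta - np.logaddexp(0.0, Theta))
    quant = eta * np.sum((B - U) ** 2)  # keep u_i close to its binary code
    return nll + quant

# Toy usage with random outputs and labels.
rng = np.random.default_rng(0)
loss = dpsh_loss(rng.standard_normal((4, 12)),
                 (rng.random((4, 4)) > 0.5).astype(float))
```

The quantization term is what lets the method keep the binary constraint in view while still updating the real-valued parameters by backpropagation.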

Numerical Results and Implications

The experimental results compellingly demonstrate DPSH's efficacy, reporting substantial gains in MAP values across various retrieval tasks compared to other methods, including non-deep and CNN-based alternatives. Notably, DPSH consistently outperforms deep hashing models relying on triplet labels, highlighting the advantages of its pairwise-focused learning.
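MAP in this setting is computed over a Hamming-distance ranking of the database for each query. The following self-contained sketch of that metric is illustrative, not the authors' evaluation code:

```python
import numpy as np

def mean_average_precision(query_codes, db_codes, relevance):
    """MAP for Hamming-ranking retrieval.

    query_codes : (q, c) binary codes in {-1, +1}
    db_codes    : (n, c) binary codes in {-1, +1}
    relevance   : (q, n) binary ground-truth relevance matrix
    """
    aps = []
    for qc, rel in zip(query_codes, relevance):
        # Hamming distance via inner product: d = (c - qc . b) / 2
        dists = (db_codes.shape[1] - db_codes @ qc) / 2
        order = np.argsort(dists, kind="stable")
        rel_sorted = rel[order]
        hits = np.cumsum(rel_sorted)
        precisions = hits / np.arange(1, len(rel_sorted) + 1)
        n_rel = rel_sorted.sum()
        if n_rel > 0:
            aps.append(np.sum(precisions * rel_sorted) / n_rel)
    return float(np.mean(aps)) if aps else 0.0
```

The inner-product identity used here is why binary codes make retrieval fast: ranking the whole database reduces to a single matrix multiplication over bit vectors.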

Theoretical and Practical Implications

Theoretically, DPSH advances the understanding of integrating feature and hash-code learning, particularly within the context of pairwise labels. Practically, its application can revolutionize large-scale image retrieval systems across domains requiring rapid and precise similarity computations.

Future Directions

Future work may explore broader CNN architectures within DPSH's framework to assess impacts on different data distributions. Further refinements in the discrete optimization technique could also yield even higher performance benchmarks. Addressing more diverse types of label configurations could additionally extend DPSH’s applicability.

In conclusion, the DPSH model provides a significant contribution to the field of image retrieval, presenting a meticulously designed hashing process enhanced by deep learning and pairwise supervision. Its potential extends to a variety of applications demanding optimized search efficiency and accuracy, solidifying its status as a notable development in supervised hashing methodologies.

Authors (3)
  1. Wu-Jun Li (57 papers)
  2. Sheng Wang (239 papers)
  3. Wang-Cheng Kang (16 papers)
Citations (643)