
Stripe-based and Attribute-aware Network: A Two-Branch Deep Model for Vehicle Re-identification (1910.05549v1)

Published 12 Oct 2019 in cs.CV

Abstract: Vehicle re-identification (Re-ID) has been attracting increasing interest in the field of computer vision due to the growing use of surveillance cameras in public security. However, vehicle Re-ID still suffers from a similarity challenge despite the efforts made to solve this problem. This challenge involves distinguishing different instances with nearly identical appearances. In this paper, we propose a novel two-branch stripe-based and attribute-aware deep convolutional neural network (SAN) to learn an efficient feature embedding for the vehicle Re-ID task. The two-branch neural network, consisting of a stripe-based branch and an attribute-aware branch, can adaptively extract discriminative features from the visual appearance of vehicles. Horizontal average pooling and dimension-reduced convolutional layers are inserted into the stripe-based branch to obtain part-level features. Meanwhile, the attribute-aware branch extracts a global feature under the supervision of vehicle attribute labels to separate similar vehicle identities that have different attribute annotations. Finally, the part-level and global features are concatenated to form the final descriptor of the input image for vehicle Re-ID. The final descriptor can not only separate vehicles with different attributes but also distinguish vehicle identities with the same attributes. Extensive experiments on both the VehicleID and VeRi databases show that the proposed SAN method outperforms other state-of-the-art vehicle Re-ID approaches.
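The abstract outlines the two-branch structure but not its exact configuration. Below is a minimal PyTorch sketch of that design under stated assumptions: the ResNet-50 backbone, the number of stripes (6), the reduced part dimension (256), and the specific attribute heads (color, type) are illustrative choices, not the authors' reported settings, and the class names (`SANSketch`, `num_ids`, etc.) are hypothetical.

```python
import torch
import torch.nn as nn
import torchvision


class SANSketch(nn.Module):
    """Sketch of a two-branch stripe-based and attribute-aware network."""

    def __init__(self, num_ids, num_colors=10, num_types=9,
                 num_stripes=6, reduced_dim=256):
        super().__init__()
        backbone = torchvision.models.resnet50(weights=None)
        # Shared convolutional trunk up to the last residual stage.
        self.trunk = nn.Sequential(*list(backbone.children())[:-2])

        # Stripe-based branch: horizontal average pooling followed by
        # 1x1 convolutions that reduce each stripe to a part-level feature.
        self.stripe_pool = nn.AdaptiveAvgPool2d((num_stripes, 1))
        self.stripe_reduce = nn.ModuleList([
            nn.Sequential(nn.Conv2d(2048, reduced_dim, 1),
                          nn.BatchNorm2d(reduced_dim), nn.ReLU())
            for _ in range(num_stripes)])
        self.stripe_cls = nn.ModuleList(
            [nn.Linear(reduced_dim, num_ids) for _ in range(num_stripes)])

        # Attribute-aware branch: global pooling plus identity and attribute
        # classifiers that supervise the global feature.
        self.global_pool = nn.AdaptiveAvgPool2d(1)
        self.id_cls = nn.Linear(2048, num_ids)
        self.color_cls = nn.Linear(2048, num_colors)
        self.type_cls = nn.Linear(2048, num_types)

    def forward(self, x):
        feat = self.trunk(x)                      # (B, 2048, H, W)

        # Part-level features from horizontal stripes.
        stripes = self.stripe_pool(feat)          # (B, 2048, num_stripes, 1)
        parts, part_logits = [], []
        for i, (reduce, cls) in enumerate(zip(self.stripe_reduce,
                                              self.stripe_cls)):
            p = reduce(stripes[:, :, i:i + 1, :]).flatten(1)  # (B, reduced_dim)
            parts.append(p)
            part_logits.append(cls(p))

        # Global feature supervised by identity and attribute labels.
        g = self.global_pool(feat).flatten(1)     # (B, 2048)
        logits = {"id": self.id_cls(g),
                  "color": self.color_cls(g),
                  "type": self.type_cls(g),
                  "parts": part_logits}

        # Final descriptor: concatenation of part-level and global features.
        descriptor = torch.cat(parts + [g], dim=1)
        return descriptor, logits
```

In such a setup, training would typically apply cross-entropy losses to the identity and attribute logits of both branches, while at test time only the concatenated descriptor is used for distance-based retrieval; the loss weighting and any metric-learning terms used in the paper are not reproduced here.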

Authors (4)
  1. Jingjing Qian (4 papers)
  2. Wei Jiang (343 papers)
  3. Hao Luo (112 papers)
  4. Hongyan Yu (5 papers)
Citations (93)
