
Person Re-identification by Contour Sketch under Moderate Clothing Change (2002.02295v1)

Published 6 Feb 2020 in cs.CV

Abstract: Person re-identification (re-id), the process of matching pedestrian images across different camera views, is an important task in visual surveillance. Substantial development of re-id has recently been observed, and the majority of existing models largely depend on color appearance and assume that pedestrians do not change their clothes across camera views. This limitation, however, can be an issue for re-id when tracking a person at different places and at different times if that person (e.g., a criminal suspect) changes his/her clothes, causing most existing methods to fail, since they rely heavily on color appearance and are thus inclined to match a person to another person wearing similar clothes. In this work, we call person re-id under clothing change "cross-clothes person re-id". In particular, we consider the case when a person only changes his clothes moderately as a first attempt at solving this problem based on visible light images; that is, we assume that a person wears clothes of a similar thickness, and thus the shape of a person would not change significantly when the weather does not change substantially within a short period of time. We perform cross-clothes person re-id based on a contour sketch of the person image to take advantage of the shape of the human body instead of color information for extracting features that are robust to moderate clothing change. Due to the lack of a large-scale dataset for cross-clothes person re-id, we contribute a new dataset that consists of 33,698 images from 221 identities. Our experiments illustrate the challenges of cross-clothes person re-id and demonstrate the effectiveness of our proposed method.

Authors (3)
  1. Qize Yang (16 papers)
  2. Ancong Wu (19 papers)
  3. Wei-Shi Zheng (148 papers)
Citations (172)

Summary

Person Re-identification by Contour Sketch under Moderate Clothing Change

The paper "Person Re-identification by Contour Sketch under Moderate Clothing Change" addresses the significant challenge of person re-identification (re-id) when individuals change clothing across different camera views. Traditional re-id systems primarily depend on color appearance to identify individuals, an approach that becomes unreliable when clothing changes occur. To tackle this limitation, Yang et al. propose a novel technique leveraging the contour sketches of individuals for re-id, aiming to extract robust features independent of clothing variations.

The authors introduce a learning-based spatial polar transformation (SPT) incorporated into a deep neural network framework. This transformation captures discriminative features from contour sketches of person images, which are inherently more stable under clothing changes than color-based features. The SPT operates by transforming contour sketches into a polar coordinate space, thereby allowing the neural network to extract reliable angle-specific features. An angle-specific extractor (ASE) further refines these features by focusing on fine-grained angle-specific information, enhancing the discriminative power of the model.
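The core idea of the SPT — re-sampling a contour sketch along rays from a center point so that each row of the output corresponds to one viewing angle — can be illustrated with a simplified, fixed-grid version. The function below is a hypothetical sketch in plain NumPy: the paper's actual SPT learns its sampling parameters end to end inside the network, whereas here the angles, radii, and center are fixed by hand.

```python
import numpy as np

def polar_transform(img, n_angles=64, n_radii=32, center=None):
    """Map a 2-D contour sketch to polar coordinates by sampling along
    rays from a center point. Output shape: (n_angles, n_radii), so each
    row collects the pixels seen at one angle (a fixed-grid stand-in for
    the paper's learnable spatial polar transformation)."""
    h, w = img.shape
    if center is None:
        center = (h / 2.0, w / 2.0)
    cy, cx = center
    max_r = min(cy, cx, h - cy, w - cx) - 1  # stay inside the image
    thetas = np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False)
    radii = np.linspace(0.0, max_r, n_radii)
    out = np.zeros((n_angles, n_radii), dtype=img.dtype)
    for i, t in enumerate(thetas):
        # nearest-neighbour sampling of the pixels along this ray
        ys = np.clip((cy + radii * np.sin(t)).astype(int), 0, h - 1)
        xs = np.clip((cx + radii * np.cos(t)).astype(int), 0, w - 1)
        out[i] = img[ys, xs]
    return out
```

In this polar layout, an angle-specific extractor can then operate on individual rows (angles) of the transformed sketch rather than on raw image patches.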

A crucial component of the research is the proposal of a multistream network architecture. This architecture is designed to aggregate multi-granularity features by varying the sampling range of SPT, allowing the model to capture both coarse and fine-grained feature details effectively. Such a structure is vital for addressing the significant intraclass variability introduced by clothing changes.
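The multi-granularity idea — running several SPT streams whose sampling ranges differ and aggregating their features — can be sketched as follows. This is a hypothetical toy version: `stream_feature` replaces the paper's CNN streams with simple pooling over radii, and the radius fractions in `r_fracs` are illustrative values, not the paper's configuration.

```python
import numpy as np

def stream_feature(img, r_frac, n_angles=36, n_radii=16):
    """One simplified 'stream': sample the sketch out to r_frac of the
    maximal radius and mean-pool over radii, yielding one feature per
    angle. Smaller r_frac focuses on the body interior (coarse shape);
    r_frac = 1.0 reaches the full contour (fine detail)."""
    h, w = img.shape
    cy, cx = h / 2.0, w / 2.0
    max_r = (min(cy, cx, h - cy, w - cx) - 1) * r_frac
    thetas = np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False)
    radii = np.linspace(0.0, max_r, n_radii)
    feat = np.empty(n_angles)
    for i, t in enumerate(thetas):
        ys = np.clip((cy + radii * np.sin(t)).astype(int), 0, h - 1)
        xs = np.clip((cx + radii * np.cos(t)).astype(int), 0, w - 1)
        feat[i] = img[ys, xs].mean()  # pool the ray into one value
    return feat

def multistream_descriptor(img, r_fracs=(0.5, 0.75, 1.0)):
    """Aggregate streams with different sampling ranges by concatenation,
    giving a descriptor that mixes coarse and fine granularities."""
    return np.concatenate([stream_feature(img, f) for f in r_fracs])
```

With the defaults above, three streams of 36 angle-wise features each concatenate into a 108-dimensional descriptor; in the paper, the analogous aggregation happens over learned CNN feature maps rather than pooled pixels.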

The authors acknowledge the lack of suitable datasets for cross-clothes person re-id and contribute the PRCC dataset, which includes 33,698 images from 221 identities. This dataset allows robust evaluation of the proposed method under controlled clothing changes and provides a resource for future research in this domain.

Experimental results highlight the effectiveness of the proposed contour-sketch-based method, demonstrating superior performance on cross-clothes re-id tasks compared to both traditional color-based methods and color-based deep learning methods. Notably, the method remains robust when clothing change is compounded with other re-id challenges, such as changes in view angle and partial occlusion. By not relying on color cues, the contour sketch approach diverges markedly from conventional methods and offers an alternative paradigm for re-id when clothing variations are present.

Overall, the paper presents a comprehensive study that combines advanced neural network techniques with a novel approach to stabilize person re-id under clothing changes. The research's implications extend to more reliable surveillance systems and improved accuracy in identifying individuals across diverse scenarios, even when they attempt to evade detection by altering their appearance. Looking forward, it opens avenues for further integrating multimodal data sources to enhance the robustness and applicability of re-id systems in practical environments.