
Learning Multi-Attention Context Graph for Group-Based Re-Identification (2104.14236v1)

Published 29 Apr 2021 in cs.CV

Abstract: Learning to re-identify or retrieve a group of people across non-overlapped camera systems has important applications in video surveillance. However, most existing methods focus on (single) person re-identification (re-id), ignoring the fact that people often walk in groups in real scenarios. In this work, we take a step further and consider employing context information for identifying groups of people, i.e., group re-id. We propose a novel unified framework based on graph neural networks to simultaneously address the group-based re-id tasks, i.e., group re-id and group-aware person re-id. Specifically, we construct a context graph with group members as its nodes to exploit dependencies among different people. A multi-level attention mechanism is developed to formulate both intra-group and inter-group context, with an additional self-attention module for robust graph-level representations by attentively aggregating node-level features. The proposed model can be directly generalized to tackle group-aware person re-id using node-level representations. Meanwhile, to facilitate the deployment of deep learning models on these tasks, we build a new group re-id dataset that contains more than 3.8K images with 1.5K annotated groups, an order of magnitude larger than existing group re-id datasets. Extensive experiments on the novel dataset as well as three existing datasets clearly demonstrate the effectiveness of the proposed framework for both group-based re-id tasks. The code is available at https://github.com/daodaofr/group_reid.
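The abstract's self-attention module, which aggregates node-level (per-member) features into a robust graph-level group representation, can be sketched roughly as follows. This is a minimal illustrative example, not the paper's implementation: the function name, the single learned attention vector `w_att`, the feature dimensions, and the use of NumPy are all assumptions made for clarity.

```python
import numpy as np

def attentive_graph_pooling(node_feats, w_att):
    """Aggregate node-level features into one graph-level vector.

    A sketch of attention-based pooling: each group member gets a
    scalar score, scores are softmax-normalized over the group, and
    the graph embedding is the weighted sum of member features.

    node_feats: (num_members, feat_dim), one row per group member
    w_att:      (feat_dim,), a (hypothetical) learned attention vector
    """
    scores = node_feats @ w_att              # (num_members,) raw scores
    weights = np.exp(scores - scores.max())  # stable softmax
    weights /= weights.sum()                 # weights sum to 1
    return weights @ node_feats              # (feat_dim,) graph embedding

# Toy example: a group of 3 people with 4-dim appearance features.
rng = np.random.default_rng(0)
feats = rng.normal(size=(3, 4))
w = rng.normal(size=4)
group_embedding = attentive_graph_pooling(feats, w)
```

Because the attention weights form a convex combination over members, the resulting group embedding stays in the same feature space as the node features, which is what lets the framework handle group re-id (graph-level) and group-aware person re-id (node-level) with shared representations.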

Authors (9)
  1. Yichao Yan (48 papers)
  2. Jie Qin (68 papers)
  3. Bingbing Ni (95 papers)
  4. Jiaxin Chen (55 papers)
  5. Li Liu (311 papers)
  6. Fan Zhu (44 papers)
  7. Wei-Shi Zheng (148 papers)
  8. Xiaokang Yang (210 papers)
  9. Ling Shao (244 papers)
Citations (37)
