Interaction-aware Spatio-temporal Pyramid Attention Networks for Action Classification (1808.01106v1)

Published 3 Aug 2018 in cs.CV

Abstract: Local features at neighboring spatial positions in feature maps are highly correlated because their receptive fields often overlap. Self-attention typically computes the weight score of each local feature from a weighted sum (or similar function) of its own internal elements, which ignores interactions among local features. To address this, we propose an effective interaction-aware self-attention model, inspired by PCA, to learn attention maps. Furthermore, since different layers of a deep network capture feature maps at different scales, we use these feature maps to construct a spatial pyramid and exploit the multi-scale information to obtain more accurate attention scores, which weight the local features at all spatial positions of the feature maps to compute the attention maps. Moreover, our spatial pyramid attention is not restricted by the number of input feature maps, so it extends easily to a spatio-temporal version. Finally, we embed our model in general CNNs to form end-to-end attention networks for action classification. Experimental results show that our method achieves state-of-the-art results on UCF101, HMDB51, and untrimmed Charades.
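The abstract describes the mechanism only in prose; the following is a minimal PyTorch sketch of what an interaction-aware attention module of this kind might look like. The class name `InteractionAwareAttention`, the 1x1 projection, and the Gram-matrix-plus-softmax scoring are illustrative assumptions, not the paper's exact formulation: the paper derives its attention maps from a PCA-inspired objective and fuses multi-scale feature maps in a spatial pyramid, both of which this single-scale sketch omits.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class InteractionAwareAttention(nn.Module):
    """Illustrative interaction-aware attention over one feature map.

    The score for each spatial position is derived from pairwise
    interactions (inner products) among all local features, rather
    than from each feature's internal elements in isolation.
    """

    def __init__(self, channels: int):
        super().__init__()
        # 1x1 projection before computing interactions (an assumption,
        # not necessarily the authors' design)
        self.proj = nn.Conv2d(channels, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B, C, H, W); each of the N = H*W columns is a local feature
        b, c, h, w = x.shape
        feats = self.proj(x).flatten(2)                    # (B, C, N)
        # Pairwise interactions among local features: a covariance-like
        # (PCA-inspired) Gram matrix over spatial positions
        inter = torch.bmm(feats.transpose(1, 2), feats)    # (B, N, N)
        # Collapse each position's interactions into one attention score
        scores = F.softmax(inter.mean(dim=-1), dim=-1)     # (B, N)
        attended = feats * scores.unsqueeze(1)             # weight local features
        return x + attended.view(b, c, h, w)               # residual output


# Usage at a single pyramid level; in the paper's setting, feature maps
# from several layers would each be scored and their multi-scale
# information combined.
x = torch.randn(2, 64, 14, 14)
attn = InteractionAwareAttention(64)
y = attn(x)  # same shape as x: (2, 64, 14, 14)
```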

Authors (6)
  1. Yang Du (24 papers)
  2. Chunfeng Yuan (35 papers)
  3. Bing Li (374 papers)
  4. Lili Zhao (30 papers)
  5. Yangxi Li (7 papers)
  6. Weiming Hu (91 papers)
Citations (78)
