Attention-based Temporal Weighted Convolutional Neural Network for Action Recognition (1803.07179v1)

Published 19 Mar 2018 in cs.CV

Abstract: Research in human action recognition has accelerated significantly since the introduction of powerful machine learning tools such as Convolutional Neural Networks (CNNs). However, effective and efficient methods for incorporating temporal information into CNNs are still being actively explored in the recent literature. Motivated by the popular recurrent attention models in natural language processing, we propose the Attention-based Temporal Weighted CNN (ATW), which embeds a visual attention model into a temporal weighted multi-stream CNN. This attention model is implemented simply as temporal weighting, yet it effectively boosts the recognition performance of video representations. Moreover, each stream in the proposed ATW framework is trainable end-to-end, with both network parameters and temporal weights optimized by stochastic gradient descent (SGD) with backpropagation. Our experiments show that the proposed attention mechanism contributes substantially to the performance gains by focusing on the more discriminative snippets, i.e., the more relevant video segments.
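
The temporal weighting described in the abstract can be illustrated with a minimal sketch: per-snippet class scores from a stream CNN are combined with learnable, softmax-normalized temporal weights, and both the weights and the network parameters receive gradients from the same loss. This is an assumption-laden illustration, not the authors' implementation; the module name `TemporalWeightedAttention`, the use of per-snippet class scores as input, and the choice of 3 snippets and 101 classes are made up for the example, with PyTorch used for concreteness.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TemporalWeightedAttention(nn.Module):
    """Hypothetical sketch of temporal weighting over K video snippets.

    Assumes each snippet has already been encoded into per-class scores
    by a stream CNN; the attention is one learnable weight per temporal
    position, normalized with a softmax.
    """
    def __init__(self, num_snippets: int):
        super().__init__()
        # One learnable logit per temporal snippet position.
        self.temporal_logits = nn.Parameter(torch.zeros(num_snippets))

    def forward(self, snippet_scores: torch.Tensor) -> torch.Tensor:
        # snippet_scores: (batch, num_snippets, num_classes)
        weights = F.softmax(self.temporal_logits, dim=0)   # (num_snippets,)
        # Weighted sum over the temporal dimension -> video-level scores.
        return torch.einsum("bkc,k->bc", snippet_scores, weights)

# Example usage: 3 snippets per video, 101 action classes (e.g. UCF101).
attn = TemporalWeightedAttention(num_snippets=3)
scores = torch.randn(8, 3, 101)                 # per-snippet class scores
video_scores = attn(scores)                     # (8, 101)
loss = F.cross_entropy(video_scores, torch.randint(0, 101, (8,)))
loss.backward()  # temporal weights and CNN parameters trained jointly via SGD
```

Because the softmax weights are ordinary parameters in the computation graph, backpropagation updates them alongside the stream CNN, which is the end-to-end property the abstract emphasizes.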

Authors (7)
  1. Jinliang Zang (4 papers)
  2. Le Wang (144 papers)
  3. Ziyi Liu (74 papers)
  4. Qilin Zhang (15 papers)
  5. Zhenxing Niu (21 papers)
  6. Gang Hua (101 papers)
  7. Nanning Zheng (146 papers)
Citations (70)

Summary

We haven't generated a summary for this paper yet.