Learning of Time-Frequency Attention Mechanism for Automatic Modulation Recognition (2111.03258v2)

Published 5 Nov 2021 in eess.SP and cs.CV

Abstract: Recent learning-based image classification and speech recognition approaches make extensive use of attention mechanisms to achieve state-of-the-art recognition power, which demonstrates the effectiveness of attention mechanisms. Motivated by the fact that the frequency and time information of modulated radio signals are crucial for modulation mode recognition, this paper proposes a time-frequency attention mechanism for a convolutional neural network (CNN)-based modulation recognition framework. The proposed time-frequency attention module is designed to learn which channel, frequency and time information is more meaningful in CNN for modulation recognition. We analyze the effectiveness of the proposed time-frequency attention mechanism and compare the proposed method with two existing learning-based methods. Experiments on an open-source modulation recognition dataset show that the recognition performance of the proposed framework is better than those of the framework without time-frequency attention and existing learning-based methods.
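The abstract describes attention gates over the channel, frequency, and time axes of a CNN feature map. The paper's exact architecture is not reproduced here; the following is a minimal numpy sketch of the general idea, where each axis is summarized by average pooling and re-weighted by a sigmoid gate. The pooling choice and the per-axis weight vectors `w_c`, `w_f`, `w_t` are placeholder assumptions standing in for the learned layers.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def time_frequency_attention(x, w_c, w_f, w_t):
    """Apply channel, frequency, and time attention gates to a
    feature map x of shape (channels, freq_bins, time_steps).

    w_c, w_f, w_t are per-axis weight vectors; using a simple
    elementwise weighting here is an assumption, not the paper's
    exact learned module.
    """
    c_desc = x.mean(axis=(1, 2))  # (C,) channel descriptor
    f_desc = x.mean(axis=(0, 2))  # (F,) frequency descriptor
    t_desc = x.mean(axis=(0, 1))  # (T,) time descriptor

    a_c = sigmoid(w_c * c_desc)[:, None, None]  # channel gate in (0, 1)
    a_f = sigmoid(w_f * f_desc)[None, :, None]  # frequency gate in (0, 1)
    a_t = sigmoid(w_t * t_desc)[None, None, :]  # time gate in (0, 1)

    # broadcasting multiplies every element by its three axis gates
    return x * a_c * a_f * a_t

# toy usage: 4 channels, 8 frequency bins, 16 time steps
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8, 16))
y = time_frequency_attention(x, np.ones(4), np.ones(8), np.ones(16))
```

Because every gate lies strictly between 0 and 1, the module can only attenuate, never amplify, each feature-map element; during training the gates would learn which channels, frequency bins, and time steps to preserve.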

Authors (3)
  1. Shangao Lin (2 papers)
  2. Yuan Zeng (9 papers)
  3. Yi Gong (53 papers)
Citations (50)
