
Text Steganalysis with Attentional LSTM-CNN (1912.12871v2)

Published 30 Dec 2019 in cs.MM

Abstract: With the rapid development of NLP technologies, text steganography methods have been significantly innovated in recent years, posing a serious threat to cybersecurity. In this paper, we propose a novel attentional LSTM-CNN model to tackle the text steganalysis problem. The proposed method first maps words into a semantic space to better exploit the semantic features of texts, and then uses a combination of Convolutional Neural Networks (CNNs) and Long Short-Term Memory (LSTM) recurrent neural networks to capture both local and long-distance contextual information in steganographic texts. In addition, we apply an attention mechanism to recognize and attend to important clues within suspicious sentences. After merging the feature clues from the CNN and LSTM branches, we use a softmax layer to categorize the input text as cover or stego. Experiments show that our model achieves state-of-the-art results on the text steganalysis task.
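The pipeline the abstract describes, embedding the text, extracting local (CNN) and long-range (LSTM) features, attention-pooling the LSTM states, merging the two feature clues, and classifying with a softmax layer, can be sketched numerically. This is a minimal NumPy illustration of the attention and classification steps only; the dimensions, the additive attention form, and the random stand-in features are assumptions, not the paper's actual architecture or parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (not specified in the abstract):
# sequence length, LSTM hidden size, CNN feature size.
T, H, F = 12, 8, 8

# Stand-in features: in the real model these come from an embedding
# layer followed by an LSTM and by 1-D convolutions with pooling.
lstm_out = rng.normal(size=(T, H))   # per-timestep LSTM hidden states
cnn_feat = rng.normal(size=(F,))     # pooled CNN feature vector

def softmax(x):
    x = x - x.max()                  # numerical stability
    e = np.exp(x)
    return e / e.sum()

# Attention over LSTM states: score each timestep with a learned
# vector (random here), then take the attention-weighted sum as the
# sentence representation.
w_att = rng.normal(size=(H,))
alpha = softmax(lstm_out @ w_att)    # (T,) attention weights, sum to 1
lstm_vec = alpha @ lstm_out          # (H,) attended summary

# Merge the two feature clues and classify cover vs. stego.
merged = np.concatenate([lstm_vec, cnn_feat])   # (H + F,)
W_cls = rng.normal(size=(2, H + F))             # stand-in classifier weights
probs = softmax(W_cls @ merged)                 # [P(cover), P(stego)]
```

The attention weights `alpha` make the model's "important clues" explicit: timesteps with large weights dominate `lstm_vec`, which is the behavior the paper relies on to surface suspicious words.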

Authors (5)
  1. YongJian Bao (4 papers)
  2. Hao Yang (328 papers)
  3. Zhongliang Yang (33 papers)
  4. Sheng Liu (122 papers)
  5. Yongfeng Huang (110 papers)
Citations (7)