
F3SNet: A Four-Step Strategy for QIM Steganalysis of Compressed Speech Based on Hierarchical Attention Network (2101.05105v2)

Published 13 Jan 2021 in cs.CR

Abstract: Traditional machine learning-based steganalysis methods for compressed speech have achieved great success in the field of communication security. However, previous studies lacked a mathematical description and model of the correlations between codewords, and there is still room for improvement in steganalysis of small-sized, low-embedding-rate samples. To address this challenge, we use Bayesian networks to measure different types of correlations between codewords in linear prediction coding and present F3SNet, a four-step strategy (Embedding, Encoding, Attention, and Classification) for quantization index modulation (QIM) steganalysis of compressed speech based on a hierarchical attention network. Embedding converts codewords into high-density numerical vectors; Encoding uses the memory characteristics of an LSTM to retain more information by distributing it among all of its hidden states; and Attention further determines which vectors have a greater impact on the final classification result. To evaluate the performance of F3SNet, we make a comprehensive comparison of F3SNet with existing steganalysis methods. Experimental results show that F3SNet surpasses the state-of-the-art methods, particularly for small-sized, low-embedding-rate samples.
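The abstract describes a standard hierarchical-attention pipeline: dense codeword embeddings, a recurrent encoder, an attention layer that weights hidden states, and a binary classifier. Below is a minimal PyTorch sketch of that pipeline. All sizes, layer names, and the attention formulation (a learned context vector, as in conventional hierarchical attention networks) are illustrative assumptions, not the authors' actual implementation.

```python
import torch
import torch.nn as nn

class F3SNetSketch(nn.Module):
    """Illustrative Embedding -> Encoding -> Attention -> Classification sketch."""

    def __init__(self, vocab_size=256, embed_dim=64, hidden_dim=128):
        super().__init__()
        # Embedding: map QIM codeword indices to dense numerical vectors.
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Encoding: a bidirectional LSTM distributes sequence information
        # across all of its hidden states.
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                            bidirectional=True)
        # Attention: score each hidden state against a learned context vector
        # to decide which positions matter most for the final decision.
        self.att_proj = nn.Linear(2 * hidden_dim, 2 * hidden_dim)
        self.context = nn.Parameter(torch.randn(2 * hidden_dim))
        # Classification: two logits (cover speech vs. stego speech).
        self.classify = nn.Linear(2 * hidden_dim, 2)

    def forward(self, codewords):               # codewords: (batch, seq_len) int64
        x = self.embed(codewords)                # (batch, seq_len, embed_dim)
        h, _ = self.lstm(x)                      # (batch, seq_len, 2*hidden_dim)
        u = torch.tanh(self.att_proj(h))         # transformed hidden states
        scores = u @ self.context                # (batch, seq_len) attention scores
        alpha = torch.softmax(scores, dim=1)     # normalized attention weights
        summary = (alpha.unsqueeze(-1) * h).sum(dim=1)  # weighted sequence summary
        return self.classify(summary)            # (batch, 2) logits

# Usage: a hypothetical batch of 4 codeword sequences of length 100.
logits = F3SNetSketch()(torch.randint(0, 256, (4, 100)))
```

The learned-context-vector attention used here follows the common hierarchical attention network recipe; the paper's exact scoring function and hyperparameters may differ.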

Citations (1)
