Title
Small-Scale Linguistic Steganalysis for Multi-Concealed Scenarios
Abstract
Recently, owing to the considerable feature-expression ability of neural networks, deep linguistic steganalysis methods have developed rapidly. However, two issues still need to be addressed. First, the prevailing linguistic steganalysis methods rely heavily on massive training data, which are labor-intensive and time-consuming to collect. Second, these methods perform steganalysis only in weak-concealed scenarios, where the stego texts in each scenario share a single language style and payload. In practice, however, the intercepted network samples are likely a mixture of stego texts with different language styles and payloads, whose semantic spatial distribution may be more chaotic than in weak-concealed scenarios, making steganalysis more difficult. To address these issues, a novel linguistic steganalysis method is proposed in this letter. First, a pre-trained BERT language model is constructed as an embedder to compensate for the shortage of data. Then, in addition to learning local and global semantic features, a feature interaction module is designed to explore the mutual effects between them. Furthermore, besides the typical cross-entropy loss, a triplet loss is introduced for model training. In this way, the proposed method can refine more comprehensive and discriminative deep features in the intricate semantic space. The performance of the proposed method is compared with representative linguistic steganalysis methods on datasets of different scales, and the experimental results demonstrate its superiority.
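The abstract outlines the overall architecture: a frozen pre-trained BERT embedder, parallel local and global semantic feature branches, a feature interaction module fusing the two, and a joint cross-entropy plus triplet objective. Below is a minimal PyTorch sketch of how such a model could be assembled; the specific module choices (a 1-D CNN for local features, a BiLSTM for global features, a bilinear layer as the interaction module) and the loss weighting `alpha` are illustrative assumptions, not the authors' implementation.

```python
# A minimal sketch, not the paper's released code. Hypothetical choices:
# CNN = local branch, BiLSTM = global branch, bilinear fusion = interaction.
import torch
import torch.nn as nn
from transformers import BertModel

class SteganalysisNet(nn.Module):
    def __init__(self, hidden=256, num_classes=2):
        super().__init__()
        # Pre-trained BERT as the embedder; frozen to suit small-scale data.
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        for p in self.bert.parameters():
            p.requires_grad = False
        dim = self.bert.config.hidden_size  # 768 for bert-base
        # Local branch: 1-D convolution over the token sequence.
        self.conv = nn.Conv1d(dim, hidden, kernel_size=3, padding=1)
        # Global branch: bidirectional LSTM over the same sequence.
        self.lstm = nn.LSTM(dim, hidden // 2, bidirectional=True, batch_first=True)
        # Hypothetical interaction module: bilinear fusion of the two branches.
        self.interact = nn.Bilinear(hidden, hidden, hidden)
        self.classifier = nn.Linear(hidden, num_classes)

    def forward(self, input_ids, attention_mask):
        emb = self.bert(input_ids=input_ids,
                        attention_mask=attention_mask).last_hidden_state
        local = torch.relu(self.conv(emb.transpose(1, 2))).mean(dim=2)  # (B, hidden)
        global_feat, _ = self.lstm(emb)
        global_feat = global_feat.mean(dim=1)                           # (B, hidden)
        fused = self.interact(local, global_feat)  # mutual effects of the branches
        return self.classifier(fused), fused

# Joint objective: cross-entropy plus a triplet loss on the fused features,
# as the abstract describes; the margin and weighting are assumptions.
ce_loss = nn.CrossEntropyLoss()
triplet_loss = nn.TripletMarginLoss(margin=1.0)

def total_loss(logits, labels, anchor, positive, negative, alpha=0.5):
    # alpha balances the two terms; the paper's weighting is not given here.
    return ce_loss(logits, labels) + alpha * triplet_loss(anchor, positive, negative)
```

The triplet term pulls fused features of same-class texts (cover/cover or stego/stego) together and pushes different-class features apart, which is one plausible way to obtain more discriminative features in a mixed, multi-concealed semantic space.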
Year
2022
DOI
10.1109/LSP.2021.3128372
Venue
IEEE Signal Processing Letters
Keywords
Linguistic steganalysis, neural networks, small-scale concealed scenarios, feature interaction
DocType
Journal
Volume
29
ISSN
1070-9908
Citations
0
PageRank
0.34
References
0
Authors
3
Name            Order  Citations  PageRank
Yimin Xu        1      0          0.34
Tengyun Zhao    2      0          0.68
Ping Zhong      3      40         11.34