Title
MARV: Multi-task learning and Attention based Rumor Verification scheme for Social Media
Abstract
Since individuals can freely post messages on social media platforms, a large amount of unverified information, so-called rumors, spreads on these platforms, seriously degrading the user experience and even disturbing social order. The application of Multi-Task Learning (MTL) to rumor verification has developed rapidly: it improves verification performance by jointly training the main task of rumor verification with the auxiliary task of stance classification. However, traditional MTL-based rumor verification schemes cannot adaptively weight different positions of the data sequence to represent it effectively, which degrades verification performance. This paper proposes MARV, a novel rumor verification scheme for social media that exploits MTL together with a multi-head attention mechanism. Specifically, the shared LSTM layer in MARV first processes and represents the tweet sequences, generating high-level virtual features. Then, in the branch for the rumor verification task, a multi-head attention layer learns the local dependencies in the high-level representations extracted from the shared layer. Experimental results on the PHEME and RumourEval datasets demonstrate that the proposed MARV scheme outperforms other MTL-based rumor verification schemes. We also investigate the impact of placing the attention module at different positions in the MTL-based architecture.
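The abstract does not include an implementation. As a rough illustration of the multi-head self-attention that MARV applies to the shared layer's representations, here is a minimal NumPy sketch; all names, dimensions, and weights are hypothetical, and the actual MARV branch sits on top of a shared LSTM and feeds a rumor-verification classifier rather than random inputs:

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(X, Wq, Wk, Wv, Wo, num_heads):
    """Scaled dot-product multi-head self-attention over a sequence.

    X: (seq_len, d_model) -- stands in for the shared LSTM's hidden states.
    Wq, Wk, Wv, Wo: (d_model, d_model) projection matrices (hypothetical).
    Returns the attended output and the per-head attention weights.
    """
    seq_len, d_model = X.shape
    d_head = d_model // num_heads

    def split_heads(M):
        # (seq_len, d_model) -> (num_heads, seq_len, d_head)
        return M.reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)

    Qh, Kh, Vh = (split_heads(X @ W) for W in (Wq, Wk, Wv))
    # Each head adaptively weights every sequence position against the others,
    # which is the capability the paper argues plain MTL schemes lack.
    scores = Qh @ Kh.transpose(0, 2, 1) / np.sqrt(d_head)
    attn = softmax(scores, axis=-1)           # each row sums to 1
    heads = attn @ Vh                         # (num_heads, seq_len, d_head)
    concat = heads.transpose(1, 0, 2).reshape(seq_len, d_model)
    return concat @ Wo, attn

# Toy run: 5 tweet positions, 8-dim shared-layer features, 2 heads.
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 8))
Wq, Wk, Wv, Wo = (0.1 * rng.standard_normal((8, 8)) for _ in range(4))
out, attn = multi_head_self_attention(X, Wq, Wk, Wv, Wo, num_heads=2)
print(out.shape, attn.shape)  # (5, 8) (2, 5, 5)
```

In the architecture the abstract describes, `out` would be pooled and passed to the rumor-verification head, while the stance-classification branch reads the shared LSTM states directly.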
Year
2022
DOI
10.1109/ICCC55456.2022.9880848
Venue
2022 IEEE/CIC International Conference on Communications in China (ICCC)
Keywords
Social platform, rumor verification, stance classification, Multi-Task Learning, Multi-Head Attention
DocType
Conference
ISSN
2377-8644
ISBN
978-1-6654-8481-7
Citations
0
PageRank
0.34
References
3
Authors
4
Name         Order  Citations  PageRank
Yufeng Wang  1      0          0.34
Bo Zhang     2      41         9.80
Jianhua Ma   3      1401       148.82
Qun Jin      4      0          0.68