
Transformer-based Language Model Fine-tuning Methods for COVID-19 Fake News Detection (2101.05509v3)

Published 14 Jan 2021 in cs.CL and cs.AI

Abstract: With the COVID-19 pandemic, related fake news is spreading widely across social media, and believing it without discrimination can cause great harm to people's lives. However, universal language models may perform weakly on this fake news detection task for lack of large-scale annotated data and sufficient semantic understanding of domain-specific knowledge, while a model trained only on the corresponding corpora is also mediocre due to insufficient learning. In this paper, we propose a novel transformer-based language model fine-tuning approach for this fake news detection task. First, the token vocabulary of each individual model is expanded to capture the actual semantics of professional phrases. Second, we adapt the heated-up softmax loss to distinguish hard-mining samples, which are common in fake news because of the ambiguity of short text. Then, we employ adversarial training to improve the model's robustness. Last, the predicted features extracted by the universal language model RoBERTa and the domain-specific model CT-BERT are fused by a multilayer perceptron to integrate fine-grained and high-level specific representations. Quantitative experiments on an existing COVID-19 fake news dataset show superior performance compared with state-of-the-art methods across various evaluation metrics; the best weighted average F1 score reaches 99.02%.
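The heated-up softmax mentioned in the abstract can be illustrated as temperature-scaled logits fed into a cross-entropy loss. The sketch below is a minimal pure-Python illustration, not the authors' implementation; the scaling factor `alpha` and its default value are assumptions for demonstration (a larger `alpha` sharpens the distribution, which up-weights hard examples near the decision boundary).

```python
import math

def heated_up_softmax(logits, alpha=4.0):
    # Scale logits by a factor alpha before the softmax; alpha > 1
    # sharpens the output distribution ("heats up" the logits).
    scaled = [alpha * z for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def heated_up_ce_loss(logits, label, alpha=4.0):
    # Cross-entropy computed on the temperature-scaled softmax.
    probs = heated_up_softmax(logits, alpha)
    return -math.log(probs[label])
```

For a correctly classified example, increasing `alpha` drives the predicted probability toward 1 and shrinks the loss, so the remaining gradient signal concentrates on the hard, ambiguous samples.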

Authors (8)
  1. Ben Chen (23 papers)
  2. Bin Chen (547 papers)
  3. Dehong Gao (26 papers)
  4. Qijin Chen (43 papers)
  5. Chengfu Huo (7 papers)
  6. Xiaonan Meng (7 papers)
  7. Weijun Ren (20 papers)
  8. Yang Zhou (311 papers)
Citations (40)
