
EdinburghNLP at WNUT-2020 Task 2: Leveraging Transformers with Generalized Augmentation for Identifying Informativeness in COVID-19 Tweets (2009.06375v3)

Published 6 Sep 2020 in cs.CL, cs.IR, cs.LG, cs.SI, and stat.ML

Abstract: Twitter and, more generally, social media have become an indispensable communication channel in times of emergency. The ubiquity of smartphones enables people to report an emergency they observe in real time. As a result, more agencies, such as disaster relief organizations and news agencies, are interested in programmatically monitoring Twitter. Recognizing the informativeness of a Tweet can therefore help filter noise from the large volume of Tweets. In this paper, we present our submission for WNUT-2020 Task 2: Identification of informative COVID-19 English Tweets. Our most successful model is an ensemble of transformers, including RoBERTa, XLNet, and BERTweet, trained in a Semi-Supervised Learning (SSL) setting. The proposed system achieves an F1 score of 0.9011 on the test set (ranking 7th on the leaderboard) and shows significant gains in performance compared to a baseline system using FastText embeddings.
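The abstract describes combining predictions from several fine-tuned transformers into one binary (informative vs. uninformative) decision. The paper does not specify the combination rule here, so the sketch below assumes a simple probability-averaging ensemble with illustrative, made-up model outputs; the function name and threshold are hypothetical, not the authors' implementation.

```python
import numpy as np

def ensemble_predict(prob_list, threshold=0.5):
    """Average per-model probabilities of the INFORMATIVE class
    and apply a decision threshold to get 0/1 labels."""
    avg = np.mean(np.stack(prob_list, axis=0), axis=0)
    return (avg >= threshold).astype(int)

# Hypothetical class probabilities from three fine-tuned models
# (RoBERTa, XLNet, BERTweet) on four example tweets.
roberta  = np.array([0.91, 0.20, 0.55, 0.05])
xlnet    = np.array([0.85, 0.35, 0.40, 0.10])
bertweet = np.array([0.95, 0.25, 0.65, 0.02])

labels = ensemble_predict([roberta, xlnet, bertweet])
print(labels.tolist())  # -> [1, 0, 1, 0]
```

Averaging probabilities (soft voting) tends to be more robust than majority voting when the member models are well calibrated, since a confident model can outweigh two borderline ones.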

Authors (1)
  1. Nickil Maveli
Citations (4)
