Lexicon generation for detecting fake news (2010.11089v1)

Published 16 Oct 2020 in cs.CL

Abstract: With the digitization of media, an immense amount of news data has been generated by online sources, including mainstream media outlets as well as social networks. However, the ease of production and distribution has resulted in the circulation of fake news alongside credible, authentic news. The pervasive dissemination of fake news has severe negative impacts on individuals and society. Fake news detection has therefore emerged as an interdisciplinary research topic attracting significant attention from many disciplines, including the social sciences and linguistics. In this study, we propose a method based primarily on lexicons, together with a scoring system, to facilitate the detection of fake news in Turkish. We contribute to the literature by collecting a novel, large-scale, and credible dataset of Turkish news and by constructing the first fake news detection lexicon for Turkish.
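
To give a concrete sense of what a lexicon-based approach with a scoring system can look like, the sketch below scores a text by averaging lexicon weights over its tokens and flags it when the score exceeds a threshold. This is an illustrative reconstruction only: the lexicon entries, weights, threshold, and tokenization here are hypothetical placeholders and do not reproduce the paper's actual Turkish lexicon or scoring rules.

```python
# Minimal sketch of a lexicon-based fake news scorer.
# All lexicon entries, weights, and the threshold are hypothetical
# placeholders, not the paper's actual lexicon or scoring system.
import re

# Hypothetical lexicon: token -> weight. Positive weights lean toward
# "fake" (sensationalist cues), negative weights toward "credible".
FAKE_NEWS_LEXICON = {
    "şok": 1.5,        # "shock" (example entry only)
    "inanılmaz": 1.0,  # "unbelievable"
    "iddia": 0.5,      # "claim"
    "kaynak": -0.8,    # "source"
    "açıklama": -0.5,  # "statement"
}

def tokenize(text: str) -> list[str]:
    """Lowercase and split on non-word characters (simplified)."""
    return [t for t in re.split(r"\W+", text.lower()) if t]

def fakeness_score(text: str) -> float:
    """Average lexicon weight over the tokens of a news text."""
    tokens = tokenize(text)
    if not tokens:
        return 0.0
    total = sum(FAKE_NEWS_LEXICON.get(tok, 0.0) for tok in tokens)
    return total / len(tokens)

def is_fake(text: str, threshold: float = 0.1) -> bool:
    """Flag a text as likely fake when its score exceeds the threshold."""
    return fakeness_score(text) > threshold

if __name__ == "__main__":
    sample = "Şok iddia: inanılmaz gelişme yaşandı"
    print(fakeness_score(sample), is_fake(sample))
```

In practice, such a lexicon would be built from a labeled corpus (here, the authors' Turkish news dataset) and the threshold tuned on held-out data; the example above only illustrates the general scoring mechanism.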

Citations (2)
