Disinformation Warfare: Understanding State-Sponsored Trolls on Twitter and Their Influence on the Web (1801.09288v2)

Published 28 Jan 2018 in cs.SI

Abstract: Over the past couple of years, anecdotal evidence has emerged linking coordinated campaigns by state-sponsored actors with efforts to manipulate public opinion on the Web, often around major political events, through dedicated accounts, or "trolls." Although they are often involved in spreading disinformation on social media, there is little understanding of how these trolls operate, what type of content they disseminate, and most importantly their influence on the information ecosystem. In this paper, we shed light on these questions by analyzing 27K tweets posted by 1K Twitter users identified as having ties with Russia's Internet Research Agency and thus likely state-sponsored trolls. We compare their behavior to a random set of Twitter users, finding interesting differences in terms of the content they disseminate, the evolution of their account, as well as their general behavior and use of Twitter. Then, using Hawkes Processes, we quantify the influence that trolls had on the dissemination of news on social platforms like Twitter, Reddit, and 4chan. Overall, our findings indicate that Russian trolls managed to stay active for long periods of time and to reach a substantial number of Twitter users with their tweets. When looking at their ability to spread news content and make it viral, however, we find that their effect on social platforms was minor, with the significant exception of news published by the Russian state-sponsored news outlet RT (Russia Today).

Disinformation Warfare: Understanding State-Sponsored Trolls on Twitter and Their Influence on the Web

Social media platforms are increasingly targeted by state-sponsored disinformation campaigns that aim to sway public opinion. This paper addresses the growing concern that state actors use these platforms to orchestrate misinformation strategies. Focusing on Twitter, it examines Russian accounts associated with the Internet Research Agency (IRA), a known hub for such operations.

Study Design and Objectives

The paper centers on a dataset of approximately 27,000 tweets from 1,000 accounts identified as Russian trolls and compares their behavior with that of a randomly selected group of Twitter users. In parallel, the researchers use Hawkes processes, a statistical model of self- and mutually exciting event sequences, to quantify the influence of these actors across several online platforms, including Twitter, Reddit, and 4chan.
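To give a sense of the modeling approach, the sketch below implements the conditional intensity of a univariate Hawkes process with an exponential kernel. This is a deliberate simplification: the paper fits a multivariate model linking events across platforms, whereas here the parameter values (`mu`, `alpha`, `beta`) and the event times are illustrative placeholders, not estimates from the paper's data.

```python
import math

def hawkes_intensity(t, events, mu=0.1, alpha=0.5, beta=1.0):
    """Conditional intensity lambda(t) = mu + sum over past events t_i < t
    of alpha * exp(-beta * (t - t_i)): a baseline rate plus a decaying
    boost contributed by each earlier event."""
    return mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events if ti < t)

def branching_ratio(alpha, beta):
    """Expected number of events directly triggered by one event (alpha / beta);
    a simple proxy for how much activity from one source 'causes' elsewhere."""
    return alpha / beta

# Intensity spikes just after events and decays back toward the baseline.
events = [1.0, 1.5]
just_after = hawkes_intensity(1.6, events)
much_later = hawkes_intensity(10.0, events)
```

In the multivariate version used for this kind of influence analysis, each platform (or account group) gets its own baseline, and cross-excitation parameters capture how events on one platform raise the event rate on another.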

Analysis and Findings

A multifaceted analysis uncovers key behavioral patterns typical of the identified state-sponsored trolls:

  • Content Dissemination: The paper shows that while these trolls disseminate information widely, their impact on making content viral is relatively minor, except in the case of Russian state-sponsored media outlets such as RT (Russia Today). This outcome is notable given the substantial volume of tweets and the geotagging strategies these accounts employed to appear to originate locally in strategic regions such as the USA, Germany, and Russia.
  • Behavioral Patterns: The trolls exhibit distinct strategies, such as periodically resetting their profiles and changing screen names to project different identities, a persistent tactic aimed at staying relevant and avoiding detection while pursuing their agendas.
  • Engagement and Use Patterns: Russian trolls predominantly tweet from web clients, in contrast to the mobile-client preference of average users. Language and sentiment analyses further reveal a proclivity for emotionally charged, subjective content targeting contentious political themes and misinformation, particularly issues active in global discourse.
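The client-usage contrast described above can be illustrated with a toy frequency comparison. The tweet-source labels and counts below are hypothetical examples for illustration, not values from the paper's dataset; in practice these labels would come from the `source` field of each tweet object.

```python
from collections import Counter

def source_shares(sources):
    """Fraction of tweets posted from each client."""
    counts = Counter(sources)
    total = sum(counts.values())
    return {client: n / total for client, n in counts.items()}

# Hypothetical per-tweet client labels; values are illustrative only.
troll_tweets = ["Twitter Web Client"] * 7 + ["TweetDeck"] * 2 + ["Twitter for iPhone"]
baseline_tweets = ["Twitter for iPhone"] * 6 + ["Twitter for Android"] * 3 + ["Twitter Web Client"]

troll_shares = source_shares(troll_tweets)
baseline_shares = source_shares(baseline_tweets)
```

Comparing the two distributions (here, a 70% vs. 10% web-client share) is the kind of aggregate contrast the paper draws between troll accounts and the random baseline.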

Implications and Speculations for Future Developments

From a theoretical standpoint, the paper extends the established discussion of how state-backed actors manipulate digital platforms to interfere with sociopolitical processes. It also raises questions about the broader ramifications such influence operations could have for democratic systems worldwide. Practically, the paper underscores an urgent need for sophisticated tools that can efficiently identify and mitigate the influence of state-sponsored trolls.

Conclusion

The limited reach of Russian trolls highlighted by the paper points to a nuanced landscape in which vast disinformation networks and formidable effort do not necessarily translate into widespread influence; nevertheless, the significant amplification of Russian state-media links should not be understated. The paper offers critical insights that enrich the discourse around digital misinformation, illustrates the complex strategies employed by malicious actors, and reinforces the need for robust research and solution-oriented approaches to counter these digital threats.

By characterizing the activities and influence of Russian trolls, this research contributes foundational knowledge that informs both present and future defenses against state-sponsored disinformation campaigns.

Authors (6)
  1. Savvas Zannettou (55 papers)
  2. Tristan Caulfield (11 papers)
  3. Emiliano De Cristofaro (117 papers)
  4. Michael Sirivianos (24 papers)
  5. Gianluca Stringhini (77 papers)
  6. Jeremy Blackburn (76 papers)
Citations (198)