
Exposing Cross-Platform Coordinated Inauthentic Activity in the Run-Up to the 2024 U.S. Election (2410.22716v4)

Published 30 Oct 2024 in cs.SI

Abstract: Coordinated information operations remain a persistent challenge on social media, despite platform efforts to curb them. While previous research has primarily focused on identifying these operations within individual platforms, this study shows that coordination frequently transcends platform boundaries. Leveraging newly collected data of online conversations related to the 2024 U.S. Election across $\mathbb{X}$ (formerly, Twitter), Facebook, and Telegram, we construct similarity networks to detect coordinated communities exhibiting suspicious sharing behaviors within and across platforms. Proposing an advanced coordination detection model, we reveal evidence of potential foreign interference, with Russian-affiliated media being systematically promoted across Telegram and $\mathbb{X}$. Our analysis also uncovers substantial intra- and cross-platform coordinated inauthentic activity, driving the spread of highly partisan, low-credibility, and conspiratorial content. These findings highlight the urgent need for regulatory measures that extend beyond individual platforms to effectively address the growing challenge of cross-platform coordinated influence campaigns.


Summary

  • The paper introduces a novel cross-platform analysis using network-based techniques and TF-IDF to detect coordinated inauthentic activity.
  • The paper reveals significant propagation of low-credibility and partisan content, highlighting systematic foreign interference through Russian-affiliated media.
  • The paper uncovers the use of AI-generated content to amplify conspiratorial narratives, underscoring the need for comprehensive regulatory measures.

Cross-Platform Coordination of Inauthentic Activity in the 2024 U.S. Election

The paper "The 2024 Election Integrity Initiative: Exposing Cross-Platform Coordinated Inauthentic Activity in the Run-Up to the 2024 U.S. Election" offers an insightful investigation into the prevalence and nature of cross-platform coordinated inauthentic activity (CoIA) surrounding the 2024 U.S. Presidential Election. The research, conducted by Federico Cinus, Marco Minici, Luca Luceri, and Emilio Ferrara, explores coordinated online disinformation campaigns that transcend individual social media platforms, a phenomenon often overlooked in single-platform studies.

Methodology and Data Collection

The authors leverage a large-scale dataset of election-related online discussions collected from May to June 2024 on X (formerly Twitter), Facebook, and Telegram. This cross-platform approach employs network-based methods to detect CoIA by constructing similarity networks that capture coordinated user behaviors. The networks are built with techniques such as TF-IDF weighting to identify users who co-share the same URLs or post highly similar textual content.
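The pipeline described above can be sketched end to end: TF-IDF-weight each account's URL-sharing profile, connect accounts whose profiles are nearly identical, and read off the connected components as candidate coordinated communities. This is an illustrative reconstruction under our own assumptions (toy account names, a simple cosine threshold, connected components rather than the paper's richer community detection), not the authors' released code.

```python
import math
from collections import Counter, defaultdict

# Hypothetical toy data: each account maps to the URLs it shared.
# Real inputs would come from the X/Facebook/Telegram collections.
shares = {
    "acct_a": ["rt.com/1", "rt.com/2", "ruptly.tv/9"],
    "acct_b": ["rt.com/1", "rt.com/2", "ruptly.tv/9"],
    "acct_c": ["rt.com/2", "news.example/5"],
    "acct_d": ["blog.example/7"],
}

def tfidf_vectors(shares):
    """TF-IDF weight each account's URL-sharing profile. URLs shared
    by few accounts get higher weight, so co-sharing them is stronger
    evidence of coordination than co-sharing viral links."""
    n = len(shares)
    df = Counter()
    for urls in shares.values():
        df.update(set(urls))
    vecs = {}
    for user, urls in shares.items():
        tf = Counter(urls)
        vecs[user] = {u: tf[u] * math.log(n / df[u]) for u in tf if df[u] < n}
    return vecs

def cosine(v, w):
    dot = sum(v[k] * w.get(k, 0.0) for k in v)
    nv = math.sqrt(sum(x * x for x in v.values()))
    nw = math.sqrt(sum(x * x for x in w.values()))
    return dot / (nv * nw) if nv and nw else 0.0

def coordination_network(shares, threshold=0.9):
    """Connect account pairs whose sharing profiles are near-identical."""
    vecs = tfidf_vectors(shares)
    users = sorted(vecs)
    edges = []
    for i, u in enumerate(users):
        for v in users[i + 1:]:
            if cosine(vecs[u], vecs[v]) >= threshold:
                edges.append((u, v))
    return edges

def communities(users, edges):
    """Connected components of the similarity network serve here as
    candidate coordinated communities (a simplification)."""
    parent = {u: u for u in users}
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for a, b in edges:
        parent[find(a)] = find(b)
    groups = defaultdict(set)
    for u in users:
        groups[find(u)].add(u)
    return [g for g in groups.values() if len(g) > 1]
```

On the toy data, `acct_a` and `acct_b` share an identical set of low-frequency URLs and are grouped together, while the incidental overlap of `acct_c` falls below the threshold. At real-data scale, pairwise cosine over sparse vectors would typically be done with sparse-matrix multiplication rather than nested loops.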

Key Findings

The analysis reveals significant intra-platform and cross-platform coordination promoting low-credibility and highly partisan content, with a distinct emphasis on narratives related to the 2024 U.S. Presidential Election. Noteworthy findings include:

  • Foreign Influence: The paper sheds light on the systematic promotion of Russian-affiliated media across Telegram and X, indicating potential foreign interference. It highlights the prevalence of domains like RT.com and Ruptly.tv within coordinated campaigns.
  • Narrative Amplification: Coordinated actors, especially those on Telegram, exhibited a clear slant towards promoting conspiratorial content involving public health, flat-earth theory, and various political topics like immigration and geopolitical tensions.
  • AI-Generated Content: The paper also uncovers the use of AI-generated content by coordinated actors; on Telegram in particular, such content was more prevalent among coordinated accounts than among organic users.

Implications

The results demonstrate a pressing need for regulatory measures that extend beyond individual platform policies to mitigate the risks posed by cross-platform CoIA. By illuminating the complex dynamics of influence operations, the paper motivates the development of comprehensive countermeasures against inauthentic campaigns targeting election integrity.

Discussion and Future Directions

From a theoretical standpoint, this research advances the understanding of the mechanisms behind coordinated information operations, emphasizing the importance of considering cross-platform interactions in the study of online disinformation. The network-based framework offers opportunities for further refinement, especially regarding statistical validation of coordination patterns and the integration of multimedia content analysis.
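One common way to statistically validate coordination patterns is a permutation null model: shuffle which account shared which URL while preserving per-account activity, and keep only account pairs whose observed co-sharing similarity exceeds what the shuffled data produces by chance. The sketch below is our own illustration of that idea (with hypothetical account and URL names), not a method the paper implements.

```python
import random

def jaccard(a, b):
    """Overlap between two accounts' shared-URL sets."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def null_threshold(shares, quantile=0.99, n_perm=200, seed=0):
    """Estimate the pairwise similarity expected by chance by
    repeatedly shuffling the account-URL assignments (keeping each
    account's activity volume fixed) and taking a high quantile of
    the resulting null similarity distribution."""
    rng = random.Random(seed)
    users = sorted(shares)
    pool = [u for urls in shares.values() for u in urls]
    sims = []
    for _ in range(n_perm):
        rng.shuffle(pool)
        it = iter(pool)
        fake = {u: [next(it) for _ in shares[u]] for u in users}
        for i, u in enumerate(users):
            for v in users[i + 1:]:
                sims.append(jaccard(fake[u], fake[v]))
    sims.sort()
    return sims[int(quantile * (len(sims) - 1))]
```

An observed edge would be retained only if its similarity exceeds `null_threshold(shares)`; when every URL in the data is shared by a single account, any overlap at all is already above chance.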

Future research could extend the cross-platform coordination detection framework to multimedia content shared across social media platforms, broadening the data modalities examined. The use of generative AI to create synthetic media for disinformation campaigns also warrants closer investigation, alongside the development of statistical models that robustly assess the significance of the identified coordination patterns.

In summary, this paper underscores the necessity of adopting a holistic and coordinated strategy to address the multifaceted challenges of inauthentic activities across social media landscapes. Its insights call for innovative regulatory and technological solutions to safeguard democratic processes against increasingly sophisticated forms of digital manipulation.