
Deplatforming Norm-Violating Influencers on Social Media Reduces Overall Online Attention Toward Them (2401.01253v1)

Published 2 Jan 2024 in cs.SI and cs.CY

Abstract: From politicians to podcast hosts, online platforms have systematically banned (``deplatformed'') influential users for breaking platform guidelines. Previous inquiries on the effectiveness of this intervention are inconclusive because 1) they consider only few deplatforming events; 2) they consider only overt engagement traces (e.g., likes and posts) but not passive engagement (e.g., views); 3) they do not consider all the potential places users impacted by the deplatforming event might migrate to. We address these limitations in a longitudinal, quasi-experimental study of 165 deplatforming events targeted at 101 influencers. We collect deplatforming events from Reddit posts and then manually curate the data, ensuring the correctness of a large dataset of deplatforming events. Then, we link these events to Google Trends and Wikipedia page views, platform-agnostic measures of online attention that capture the general public's interest in specific influencers. Through a difference-in-differences approach, we find that deplatforming reduces online attention toward influencers. After 12 months, we estimate that online attention toward deplatformed influencers is reduced by -63% (95% CI [-75%,-46%]) on Google and by -43% (95% CI [-57%,-24%]) on Wikipedia. Further, as we study over a hundred deplatforming events, we can analyze in which cases deplatforming is more or less impactful, revealing nuances about the intervention. Notably, we find that both permanent and temporary deplatforming reduce online attention toward influencers; Overall, this work contributes to the ongoing effort to map the effectiveness of content moderation interventions, driving platform governance away from speculation.

Authors (5)
  1. Manoel Horta Ribeiro
  2. Shagun Jhaver
  3. Jordi Cluet i Martinell
  4. Marie Reignier-Tayar
  5. Robert West

Summary

  • The paper employs a stacked difference-in-differences design to show that deplatforming produces a 63% reduction in Google search interest and a 43% drop in Wikipedia pageviews after 12 months.
  • The paper uses a longitudinal, quasi-experimental approach analyzing 165 deplatforming events of 101 influencers to capture causal effects.
  • The paper reveals that both temporary and permanent bans achieve similar declines in attention, with misinformation spreaders experiencing even greater reductions.

Effects of Deplatforming on Online Attention Toward Norm-Violating Influencers

This paper provides a detailed analysis of the effects of deplatforming influencers on social media and the resulting reduction in online attention toward these individuals. Through a longitudinal, quasi-experimental study, the authors address the limitations of previous research, specifically the narrow focus on a few deplatforming events and the exclusive reliance on overt engagement metrics. The study examines 165 deplatforming events involving 101 influencers, using both Google Trends and Wikipedia pageviews as proxies for online attention.

The methodology is a stacked difference-in-differences (DiD) approach, which separates the causal effect of deplatforming from background trends in public interest. The authors find that, on average, deplatforming reduces online attention by 63% as measured through Google and by 43% on Wikipedia after 12 months. These findings lend robust support to the hypothesis that deplatforming is an effective mitigation strategy for reducing the exposure and influence of norm-violating figures across the digital ecosystem.
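The core contrast behind a DiD estimate can be illustrated with a minimal two-group, two-period sketch on hypothetical attention data. The paper's actual stacked design pools many events with staggered timing and control groups; the numbers and function below are illustrative assumptions, not the authors' implementation.

```python
# Minimal two-group, two-period difference-in-differences sketch on
# hypothetical attention data (illustrative only; the paper's stacked
# DiD pools many deplatforming events with staggered timing).

def did_estimate(treated_pre, treated_post, control_pre, control_post):
    """DiD = (change in treated group) - (change in control group)."""
    mean = lambda xs: sum(xs) / len(xs)
    return (mean(treated_post) - mean(treated_pre)) - (
        mean(control_post) - mean(control_pre)
    )

# Hypothetical log-attention values.
treated_pre  = [5.0, 5.2, 4.8]   # deplatformed influencers, before the ban
treated_post = [4.0, 4.1, 3.9]   # after the ban
control_pre  = [5.1, 5.0, 4.9]   # matched, never-deplatformed controls
control_post = [5.0, 5.1, 4.8]

effect = did_estimate(treated_pre, treated_post, control_pre, control_post)
print(round(effect, 3))  # prints -0.967: attention fell relative to controls
```

Subtracting the control group's change nets out shifts in attention that would have occurred regardless of the ban, which is what lets the design attribute the residual drop to deplatforming itself.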

The paper also contributes significant insights into the nuanced effectiveness of deplatforming. For instance, both temporary and permanent deplatforming events yield similar reductions in attention, suggesting that even time-limited sanctions can effectively curb the influence of controversial figures. Furthermore, influencers associated with misinformation dissemination experience a comparatively larger decrease in attention following deplatforming than those associated with other types of norm violations.

An intriguing aspect of the paper is how it leverages platform-agnostic engagement metrics. Unlike previous studies focused purely on social media metrics such as posts and likes, this research taps into broader digital behavior. Metrics from Google Trends and Wikipedia offer a more holistic measure of public attention, paving the way for future research that extends beyond direct engagement metrics.
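One of the two attention measures, Wikipedia pageviews, is publicly queryable through the Wikimedia Pageviews REST API. The sketch below only builds a request URL for that endpoint; the article title and date range are illustrative placeholders, and this is not the authors' data-collection code.

```python
# Sketch: constructing a query URL for the public Wikimedia Pageviews
# REST API, one possible source for platform-agnostic attention data.
# The endpoint format is real; article and dates are placeholders.

BASE = "https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article"

def pageviews_url(article, start, end, project="en.wikipedia",
                  access="all-access", agent="user", granularity="daily"):
    # Spaces in article titles become underscores in the API path.
    title = article.replace(" ", "_")
    return f"{BASE}/{project}/{access}/{agent}/{title}/{granularity}/{start}/{end}"

url = pageviews_url("Alex Jones", "20180801", "20190801")
print(url)
# An HTTP GET on this URL (with a descriptive User-Agent header)
# returns daily view counts as JSON.
```

Because such counts come from outside any single social platform, they capture passive public interest that likes and posts miss, which is precisely the gap the paper highlights in prior work.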

The theoretical implications of this research point towards a more strategic approach in content moderation policies. Platforms might consider employing temporary bans more frequently as they appear to suffice in reducing unwanted attention while mitigating potential backlash associated with permanent exclusions. Moreover, the effective targeting of misinformation actors suggests that strategies might need to be tailored according to the type of norm violation to maximize the effectiveness of interventions.

Future research could further expand on these findings by exploring the causal pathways through which deplatformed influencers manage to migrate and rebuild their audience on alternative platforms. Another area for exploration could be the broader societal impact of deplatforming, potentially examining whether it contributes to the fragmentation of discourse across mainstream and alternative digital spaces.

Overall, this paper makes a valuable contribution to the literature on content moderation strategies by empirically demonstrating the efficacy of deplatforming in a comprehensive and data-driven manner. It provides a framework that can be further built upon to adaptively govern online spaces in an era where digital activity is intricately tied to societal discourse.