Do Platform Migrations Compromise Content Moderation? Evidence from r/The_Donald and r/Incels (2010.10397v3)

Published 20 Oct 2020 in cs.CY

Abstract: When toxic online communities on mainstream platforms face moderation measures, such as bans, they may migrate to other platforms with laxer policies or set up their own dedicated websites. Previous work suggests that within mainstream platforms, community-level moderation is effective in mitigating the harm caused by the moderated communities. It is, however, unclear whether these results also hold when considering the broader Web ecosystem. Do toxic communities continue to grow in terms of their user base and activity on the new platforms? Do their members become more toxic and ideologically radicalized? In this paper, we report the results of a large-scale observational study of how problematic online communities progress following community-level moderation measures. We analyze data from r/The_Donald and r/Incels, two communities that were banned from Reddit and subsequently migrated to their own standalone websites. Our results suggest that, in both cases, moderation measures significantly decreased posting activity on the new platform, reducing the number of posts, active users, and newcomers. In spite of that, users in one of the studied communities (r/The_Donald) showed increases in signals associated with toxicity and radicalization, which justifies concerns that the reduction in activity may come at the expense of a more toxic and radical community. Overall, our results paint a nuanced portrait of the consequences of community-level moderation and can inform their design and deployment.

Citations (119)

Summary

  • The paper shows that deplatforming leads to a significant drop in posts and active users, indicating reduced overall community activity.
  • The study finds divergent effects, with r/The_Donald experiencing increased toxicity and radicalization while r/Incels maintained more stable content characteristics.
  • The research uses quantitative methods such as regression discontinuity to show that the relatively higher activity of migrated users is driven by self-selection of already-active users rather than by changes in their posting behavior.

Insights into the Effects of Platform Migration on Content Moderation

The paper "Do Platform Migrations Compromise Content Moderation? Evidence from r/The_Donald and r/Incels," presents an empirical investigation of the consequences when moderated online communities migrate from mainstream platforms like Reddit to alternative platforms with less stringent content moderation practices. The paper focuses on two notable instances of such migrations: the subreddits r/The_Donald and r/Incels, which were banned from Reddit and subsequently moved to standalone websites. By analyzing these events, the research aims to explore whether community-level moderation measures, such as platform bans, are effective at disrupting harmful behaviors or inadvertently contribute to increased toxicity and radicalization.

Results Overview

The researchers employed a quantitative methodology based on data collected before and after the communities migrated to their new platforms. The primary metrics of interest were user activity, newcomer inflow, and changes in language use reflecting toxicity and radicalization. Using regression discontinuity and other statistical techniques, the paper compares the pre- and post-migration characteristics of these communities to draw insights about the broader implications of deplatforming.
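As a rough illustration of the kind of regression-discontinuity-style comparison described above, the sketch below fits separate linear trends on either side of a ban date and reads the level shift off a post-ban indicator. The data frame, column names, synthetic activity series, and the ban date shown are placeholders for illustration, not the paper's actual data or code.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical daily activity series around a ban date (synthetic data,
# purely for illustration of the modeling setup).
df = pd.DataFrame({
    "date": pd.date_range("2017-09-01", periods=120, freq="D"),
    "posts": np.random.poisson(500, 120),
})
ban_date = pd.Timestamp("2017-11-07")  # illustrative cutoff

# Center time on the ban and flag the post-ban period.
df["t"] = (df["date"] - ban_date).dt.days
df["post_ban"] = (df["t"] >= 0).astype(int)

# Separate linear trends on each side of the cutoff; the coefficient on
# post_ban estimates the discontinuity (level shift) at the ban.
model = smf.ols("posts ~ post_ban * t", data=df).fit()
print(model.params["post_ban"])
print(model.conf_int().loc["post_ban"])
```

In practice one would restrict the series to a bandwidth around the cutoff and account for autocorrelation and seasonality; the paper's actual specification may differ from this minimal sketch.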

  1. Reduction in Activity: Both communities saw a marked decrease in activity after migration. The number of posts and active users dropped significantly on the new platforms compared with the original subreddits, suggesting that bans are effective at curbing the reach and engagement of toxic communities.
  2. Changes in Toxicity and Radicalization: The content findings were more nuanced. In the r/The_Donald community, signals associated with toxicity and ideological radicalization increased after the migration, including a rise in hostile language and group-identifying pronouns. By contrast, the r/Incels community showed fewer changes, indicating that the impact of migration varies across groups.
  3. Self-Selection vs. Behavior Change: An analysis of users matched by username across platforms indicated that the higher relative activity of migrants in the new environment was primarily attributable to self-selection rather than to changes in posting behavior: more active Reddit users were more likely to migrate, but they did not escalate their activity further on the new platforms (a small illustrative sketch follows this list).
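The self-selection point in item 3 can be made concrete with a small sketch: match accounts that use the same username on Reddit and on the new site, then compare (a) the prior Reddit activity of migrants versus non-migrants and (b) each matched user's activity before and after the move. The data frames, column names, and username-matching rule here are assumptions for illustration; the paper's matching and measurement details may differ.

```python
import pandas as pd

# Hypothetical per-user post counts on Reddit and on the standalone site.
reddit = pd.DataFrame({"user": ["a", "b", "c", "d"],
                       "reddit_posts": [120, 8, 45, 300]})
new_site = pd.DataFrame({"user": ["a", "d"],
                         "newsite_posts": [90, 260]})

# Treat accounts with identical usernames on both platforms as migrants.
merged = reddit.merge(new_site, on="user", how="left")
merged["migrated"] = merged["newsite_posts"].notna()

# Self-selection check: were migrants already more active on Reddit
# than users who did not move?
print(merged.groupby("migrated")["reddit_posts"].mean())

# Behavior-change check: did matched users post more after migrating
# than they did on Reddit over a comparable window?
migrants = merged[merged["migrated"]]
print((migrants["newsite_posts"] / migrants["reddit_posts"]).median())
```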

Implications and Speculations

The paper provides critical insights into the complexity of moderating toxic online content. While platform bans immediately reduce a community's online presence and its attraction of newcomers, they may also concentrate toxicity within the migrated community, as the r/The_Donald findings show. This presents a double-edged sword for platform administrators: curbing a community's harmful impact on one platform may catalyze more concentrated radicalization on independent sites.

From a theoretical perspective, these results contribute to the discourse on the efficacy of content moderation strategies and their unintended consequences in the digital ecosystem. Practically, they highlight the importance of a comprehensive content moderation policy that considers both direct and ripple effects across online spaces.

Future research could expand on these results by evaluating a broader set of communities, examining different types of migration patterns, and incorporating multi-platform dynamics. Further exploration of the causal mechanisms linking platform migration to real-world impacts would also deepen understanding of the genuine efficacy of deplatforming as a mitigation strategy.

Overall, this paper is a valuable contribution to the ongoing examination of the implications of digital content moderation, providing a grounded analysis that challenges traditional assumptions regarding the outcomes of deplatforming actions.
