"How over is it?" Understanding the Incel Community on YouTube (2001.08293v7)

Published 22 Jan 2020 in cs.CY

Abstract: YouTube is by far the largest host of user-generated video content worldwide. Alas, the platform has also come under fire for hosting inappropriate, toxic, and hateful content. One community that has often been linked to sharing and publishing hateful and misogynistic content is the Involuntary Celibates (Incels), a loosely defined movement ostensibly focusing on men's issues. In this paper, we set out to analyze the Incel community on YouTube by focusing on this community's evolution over the last decade and understanding whether YouTube's recommendation algorithm steers users towards Incel-related videos. We collect videos shared on Incel communities within Reddit and perform a data-driven characterization of the content posted on YouTube. Among other things, we find that the Incel community on YouTube is getting traction and that, during the last decade, the number of Incel-related videos and comments rose substantially. We also find that users have a 6.3% chance of being suggested an Incel-related video by YouTube's recommendation algorithm within five hops when starting from a non-Incel-related video. Overall, our findings paint an alarming picture of online radicalization: not only is Incel activity increasing over time, but platforms may also play an active role in steering users towards such extreme content.

Authors (6)
  1. Kostantinos Papadamou (8 papers)
  2. Savvas Zannettou (55 papers)
  3. Jeremy Blackburn (76 papers)
  4. Emiliano De Cristofaro (117 papers)
  5. Gianluca Stringhini (77 papers)
  6. Michael Sirivianos (24 papers)
Citations (53)

Summary

Analysis of the Evolution and Impact of the Incel Community on YouTube

The paper "How over is it? Understanding the Incel Community on YouTube" presents a comprehensive paper of the Involuntary Celibates (Incels) community, focusing on its presence and growth on YouTube over the past decade. Leveraging a rich dataset of YouTube videos shared in Incel-related subreddits, the authors aim to characterize the content, evaluate its evolution, and understand the influence of YouTube's recommendation algorithm in steering users towards Incel content. This research is pivotal given the community's association with misogyny and reports of contributing to real-world violence.

Key Findings

  1. Growth of Incel-related Content: The paper documents a notable increase in Incel-related activity on YouTube in recent years, evident both in the volume of Incel-related videos and in comments using Incel-specific terminology, suggesting that Incels increasingly use YouTube to propagate their views (see the first sketch after this list).
  2. Impact of Platform Migration: Despite measures such as Reddit banning specific subreddits to curb such content, Incel-related activity has continued to grow. This aligns with previous findings on the resilience and adaptability of such communities, which migrate to other platforms like YouTube and thereby circumvent moderation efforts.
  3. YouTube's Role in Content Dissemination: Approximately 2.9% of the videos reached by recommendations from Incel-related videos are themselves Incel-related, versus 1.5% when starting from control videos. Notably, a user who starts from non-Incel content has a 6.3% probability of being recommended an Incel-related video within five hops (see the second sketch after this list), highlighting YouTube's inadvertent role in exposing users to such extreme content.
  4. Recommendation Algorithm and User Steering: The algorithm tends to steer users deeper into Incel-related territory, particularly once users show an initial interest in such content. The likelihood of being recommended more Incel content increases with consecutive views of Incel-related videos, underscoring the algorithm's potential to create an echo chamber.
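
Finding 1 rests on detecting Incel-specific terminology in comments. Below is a minimal, hypothetical sketch of that kind of lexicon-based labeling; the terms and per-year samples are placeholders, since the paper builds its lexicon from a community glossary rather than a hard-coded list.

```python
import re

# Hypothetical stand-in terms; the paper derives its lexicon from a
# community glossary, not from this hard-coded set.
INCEL_LEXICON = {"blackpill", "looksmax", "chad", "normie"}

def is_incel_related(text: str, min_hits: int = 1) -> bool:
    """Flag text containing at least `min_hits` lexicon terms
    (case-insensitive, whole-word match)."""
    tokens = re.findall(r"[a-z]+", text.lower())
    return sum(tok in INCEL_LEXICON for tok in tokens) >= min_hits

# Hypothetical per-year comment samples to chart the trend over time.
comments_by_year = {
    2014: ["just be yourself", "take the blackpill"],
    2019: ["blackpill is real", "chad wins again", "nice video"],
}
trend = {
    year: sum(is_incel_related(c) for c in comments)
    for year, comments in comments_by_year.items()
}
print(trend)  # {2014: 1, 2019: 2}
```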

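Finding 3's 6.3% figure comes from simulating random walks over the recommendation graph. The sketch below is a minimal estimator under the assumption that the graph is a plain adjacency map and labels come from a predicate; the hop limit, walk count, and toy graph are illustrative, not the paper's actual setup or data.

```python
import random

def walk_hits_incel(graph, start, is_incel, max_hops=5):
    """One random walk: from `start`, follow a uniformly random
    recommendation edge up to `max_hops` times; return True if any
    visited video is labeled Incel-related."""
    node = start
    for _ in range(max_hops):
        neighbors = graph.get(node, [])
        if not neighbors:
            return False
        node = random.choice(neighbors)
        if is_incel(node):
            return True
    return False

def estimate_hit_probability(graph, starts, is_incel, n_walks=1000):
    """Monte Carlo estimate of the chance a walk from a random start
    video encounters an Incel-related video within `max_hops` hops."""
    hits = sum(
        walk_hits_incel(graph, random.choice(starts), is_incel)
        for _ in range(n_walks)
    )
    return hits / n_walks

# Hypothetical toy graph: video id -> list of recommended video ids.
graph = {"v0": ["v1", "v2"], "v1": ["v3"], "v2": ["v3"], "v3": []}
print(estimate_hit_probability(graph, ["v0"], lambda v: v == "v3"))
```

Averaging over many walks from many start videos is what lets a point estimate like 6.3% be read as the typical exposure risk for a user who follows recommendations at random.
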
Implications and Future Work

The findings underscore a multifaceted challenge in moderating Incel-related and broader extremist content online. The paper's results suggest that while platform-specific moderation (such as subreddit bans) disrupts communication channels, it does not deter community activity. The role of YouTube's recommendation algorithm in potentially nudging users towards extremist content necessitates further examination and calls for a reevaluation of content recommendation strategies that prioritize user engagement over safety.

For future work, a cross-platform analysis would be invaluable in capturing the full extent of this community's impact and evolution. Investigating the personalized effects of recommendation systems and conducting qualitative studies involving user experiences may provide additional insights into effective mitigation strategies. Researchers should continue to explore algorithmic transparency and develop frameworks that address the inadvertent promotion of harmful content while balancing user engagement.

Conclusion

This paper critically examines the intersection of social media, algorithmic influence, and extremist communities, providing a deeper understanding of how platforms like YouTube contribute to the spread of harmful ideologies. The Incel community serves as a poignant case study of the broader challenges technology poses for moderating hate and ensuring community safety. Addressing these challenges requires cooperative efforts across platforms and continuous adaptation of moderation policies to counter the adaptive strategies of radical communities.
