Auditing Radicalization Pathways on YouTube (1908.08313v4)

Published 22 Aug 2019 in cs.CY and cs.SI

Abstract: Non-profits, as well as the media, have hypothesized the existence of a radicalization pipeline on YouTube, claiming that users systematically progress towards more extreme content on the platform. Yet, there is to date no substantial quantitative evidence of this alleged pipeline. To close this gap, we conduct a large-scale audit of user radicalization on YouTube. We analyze 330,925 videos posted on 349 channels, which we broadly classified into four types: Media, the Alt-lite, the Intellectual Dark Web (I.D.W.), and the Alt-right. According to the aforementioned radicalization hypothesis, channels in the I.D.W. and the Alt-lite serve as gateways to fringe far-right ideology, here represented by Alt-right channels. Processing 72M+ comments, we show that the three channel types indeed increasingly share the same user base; that users consistently migrate from milder to more extreme content; and that a large percentage of users who consume Alt-right content now consumed Alt-lite and I.D.W. content in the past. We also probe YouTube's recommendation algorithm, looking at more than 2M video and channel recommendations between May/July 2019. We find that Alt-lite content is easily reachable from I.D.W. channels, while Alt-right videos are reachable only through channel recommendations. Overall, we paint a comprehensive picture of user radicalization on YouTube.

An Overview of "Auditing Radicalization Pathways on YouTube"

The paper "Auditing Radicalization Pathways on YouTube" by Manoel Horta Ribeiro et al. provides a comprehensive quantitative analysis of the radicalization trajectories on YouTube, grounded in an extensive dataset of user comments and recommendation graphs. The authors explore a hypothesis that YouTube serves as a radicalization pipeline leading users from mainstream content to extreme far-right content, characterized as Alt-right, through intermediary stages represented by Alt-lite and Intellectual Dark Web (I.D.W.) communities.

The research traces user interaction across these communities over more than a decade, drawing on a dataset of 330,925 videos and over 72 million comments. The audit focuses on three primary research questions: how these channels have grown, how users migrate towards more extreme content, and to what extent YouTube's recommendation system facilitates this migration.

Growth and Engagement of Controversial Communities

The paper documents a clear growth pattern in the analyzed communities, particularly since 2015, coinciding with significant socio-political events such as political campaigns and elections. Engagement metrics, such as likes and comments, are notably higher among the studied communities than among mainstream media channels.
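To make the comparison concrete, the sketch below computes per-community engagement rates from toy video records. The record layout, the normalization by views, and all numbers are illustrative assumptions, not the paper's exact metric or data.

```python
from collections import defaultdict

# Toy sketch of per-community engagement rates, assuming video records of
# the form (community, views, likes, comments). All values are illustrative.
videos = [
    ("Alt-right", 10_000, 900, 400),
    ("Alt-right", 5_000, 600, 250),
    ("Media", 100_000, 2_000, 500),
]

totals = defaultdict(lambda: [0, 0, 0])  # [views, likes, comments]
for community, views, likes, comments in videos:
    totals[community][0] += views
    totals[community][1] += likes
    totals[community][2] += comments

for community, (views, likes, comments) in totals.items():
    print(f"{community}: likes/view={likes / views:.4f}, "
          f"comments/view={comments / views:.4f}")
```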

User Trajectories and Radicalization

A significant portion of the paper is dedicated to understanding user migration patterns. The analysis shows that a substantial fraction of users who comment on Alt-lite or I.D.W. content eventually go on to comment on Alt-right content. This migration is persistent across the years studied, and the migrating users account for a significant share of the Alt-right commenting audience, suggesting the existence of a radicalization pathway facilitated by YouTube.
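One way to approximate this kind of migration measurement is sketched below: given comment records labelled with a community and a year, it computes the fraction of one community's commenters in a given year who also comment on another community's channels in a later year. The record format, function name, and toy data are hypothetical, meant only to illustrate the idea rather than reproduce the authors' pipeline.

```python
from collections import defaultdict

# Minimal sketch of a commenter-migration measurement, assuming each comment
# is a (user_id, community, year) record with community in
# {"Media", "Alt-lite", "I.D.W.", "Alt-right"}. Illustrative, not the
# authors' released code.

def migration_rate(comments, source, target, source_year, target_year):
    """Fraction of users commenting on `source` channels in `source_year`
    who also comment on `target` channels in `target_year`."""
    users = defaultdict(set)  # (year, community) -> set of user ids
    for user_id, community, year in comments:
        users[(year, community)].add(user_id)

    source_users = users[(source_year, source)]
    target_users = users[(target_year, target)]
    if not source_users:
        return 0.0
    return len(source_users & target_users) / len(source_users)

# Toy example: share of 2017 Alt-lite commenters who also commented on
# Alt-right channels in 2018.
comments = [
    ("u1", "Alt-lite", 2017), ("u2", "Alt-lite", 2017),
    ("u1", "Alt-right", 2018), ("u3", "I.D.W.", 2017),
]
print(migration_rate(comments, "Alt-lite", "Alt-right", 2017, 2018))  # 0.5
```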

Role of Recommendation Algorithms

In examining recommendation algorithms, the authors simulate navigation over YouTube's video and channel recommendation graphs, finding that Alt-lite content is easily reachable from I.D.W. channels, whereas Alt-right channels are reachable mainly through channel recommendations rather than video recommendations. Even without logged-in personalization, channel recommendations thus contribute significantly to users encountering extreme content.
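A simplified version of such a simulation is sketched below: random walks over a toy channel-recommendation graph estimate how often a walker starting at an I.D.W. channel reaches an Alt-right channel within a few hops. The graph, channel names, and parameters are hypothetical, not the paper's crawled recommendation data.

```python
import random

# Sketch of simulated navigation over a channel-recommendation graph,
# assuming an adjacency map {channel: [recommended channels]} plus a
# community label per channel. Graph and labels are hypothetical.
recommendations = {
    "idw_channel": ["altlite_channel", "media_channel"],
    "altlite_channel": ["altright_channel", "idw_channel"],
    "media_channel": ["media_channel"],
    "altright_channel": ["altright_channel"],
}
community = {
    "idw_channel": "I.D.W.",
    "altlite_channel": "Alt-lite",
    "media_channel": "Media",
    "altright_channel": "Alt-right",
}

def reach_probability(start, target_community, steps=5, walks=10_000, seed=0):
    """Estimate the probability that a random walk of at most `steps` hops
    starting at `start` reaches a channel of `target_community`."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(walks):
        node = start
        for _ in range(steps):
            node = rng.choice(recommendations[node])
            if community[node] == target_community:
                hits += 1
                break
    return hits / walks

print(reach_probability("idw_channel", "Alt-right"))
```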

Implications and Future Work

The findings of this paper carry substantial implications for understanding how social media platforms can facilitate radicalization pathways. They highlight the role of recommendations in shaping users' content consumption trajectories, while acknowledging limitations stemming from the snapshot nature of the recommendation data and the absence of logged-in personalization in the crawls.

Continued research in this area could explore the specific narratives promoted within these communities and further clarify the causal mechanisms of radicalization. Furthermore, understanding the interplay between platform design and user psychology may lead to more robust strategies for mitigating the spread of extremist ideologies online.

In summary, this paper provides a rigorous framework for auditing user radicalization dynamics on large content-sharing platforms like YouTube, offering valuable insights into the nuanced ways that algorithm-driven content recommendations might steer societal discourse. This work lays the groundwork for future studies aiming to dissect the multi-faceted nature of online radicalization phenomena and to develop informed interventions in content platform governance.

Authors (5)
  1. Manoel Horta Ribeiro (44 papers)
  2. Raphael Ottoni (2 papers)
  3. Robert West (154 papers)
  4. Virgílio A. F. Almeida (6 papers)
  5. Wagner Meira (2 papers)
Citations (331)