
TikTok's recommendations skewed towards Republican content during the 2024 U.S. presidential race (2501.17831v2)

Published 29 Jan 2025 in cs.SI and cs.CY

Abstract: TikTok is a major force among social media platforms with over a billion monthly active users worldwide and 170 million in the United States. The platform's status as a key news source, particularly among younger demographics, raises concerns about its potential influence on politics in the U.S. and globally. Despite these concerns, there is scant research investigating TikTok's recommendation algorithm for political biases. We fill this gap by conducting 323 independent algorithmic audit experiments testing partisan content recommendations in the lead-up to the 2024 U.S. presidential elections. Specifically, we create hundreds of "sock puppet" TikTok accounts in Texas, New York, and Georgia, seeding them with varying partisan content and collecting algorithmic content recommendations for each of them. Collectively, these accounts viewed ~394,000 videos from April 30th to November 11th, 2024, which we label for political and partisan content. Our analysis reveals significant asymmetries in content distribution: Republican-seeded accounts received ~11.8% more party-aligned recommendations compared to their Democratic-seeded counterparts, and Democratic-seeded accounts were exposed to ~7.5% more opposite-party recommendations on average. These asymmetries exist across all three states and persist when accounting for video- and channel-level engagement metrics such as likes, views, shares, comments, and followers, and are driven primarily by negative partisanship content. Our findings provide insights into the inner workings of TikTok's recommendation algorithm during a critical election period, raising fundamental questions about platform neutrality.

Summary

  • The paper demonstrates that Republican-conditioned accounts received 11.8% more ideologically matched recommendations than Democratic ones.
  • It employed controlled sock puppet experiments in Texas, New York, and Georgia to systematically assess partisan biases in TikTok's recommendations.
  • Findings reveal systematic discrepancies in content topics, raising concerns about TikTok’s role in shaping political narratives and news dissemination.

TikTok's Recommendations Skewed Towards Republican Content During the 2024 U.S. Presidential Race

This essay provides a comprehensive analysis of the paper "TikTok's recommendations skewed towards Republican content during the 2024 U.S. presidential race" (2501.17831). The paper explores the influence of TikTok's recommendation algorithm on political content consumption in the context of the 2024 U.S. presidential elections.

Introduction

TikTok is a dominant social media platform with a significant number of users in the U.S., especially among younger demographics. This research investigates potential biases in TikTok's recommendation algorithm during the critical 2024 U.S. presidential elections. Through extensive automated experiments conducted in Texas, New York, and Georgia, the authors assessed how TikTok's algorithm presented partisan content based on users' geographical and political alignment preferences. The paper specifically examined whether the algorithm tends to favor content aligned with either Republican or Democratic ideologies.

Experimental Setup

The experimental setup involved creating controlled "sock puppet" TikTok accounts that simulated user behavior by viewing videos with predefined partisan content. These accounts, distributed across three states with differing political leanings (Texas, New York, and Georgia), were conditioned to watch either Democratic- or Republican-aligned videos before browsing TikTok's "For You" page. Each experimental cycle lasted one week and included a conditioning phase, in which accounts watched partisan-aligned videos, and a recommendation phase, in which the content suggested by TikTok's algorithm was recorded (Figure 1).

Figure 1: A device's timeline during a weekly experimental run.
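The weekly run structure described above can be sketched as a simple schedule. This is an illustrative sketch only: the paper describes weekly cycles with a conditioning phase followed by a recommendation phase, but the exact split of days (here, two conditioning days) is an assumption, not a detail confirmed by the source.

```python
from enum import Enum

class Phase(Enum):
    CONDITIONING = "conditioning"      # account watches partisan-seeded videos
    RECOMMENDATION = "recommendation"  # account scrolls the "For You" feed and logs what it sees

def weekly_schedule(days=7, conditioning_days=2):
    """Split a weekly experimental run into a conditioning phase
    followed by a recommendation phase. The 2-day conditioning
    window is a hypothetical choice for illustration."""
    return [Phase.CONDITIONING if day < conditioning_days else Phase.RECOMMENDATION
            for day in range(days)]

schedule = weekly_schedule()
```

In the actual audit, each such run was replicated hundreds of times across accounts seeded with different partisan content in the three states.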

Political Content Analysis

The analysis employed LLMs (GPT-4, GPT-4o, and Gemini-Pro) to classify videos by their political content, taking the majority vote across the three models' labels. Videos were assessed for their political nature, connection to the election or key political figures, and general ideological stance. Findings revealed an asymmetric bias: Republican-conditioned accounts received 11.8% more ideologically matched recommendations, while Democratic-conditioned accounts saw 7.5% more ideologically opposed content.

Key Findings

The most striking result of the paper is the apparent bias of the recommendation algorithm towards Republican-aligned content. Republican-conditioned accounts consistently received more co-partisan recommendations than Democratic-conditioned ones, and channels and videos containing anti-Democratic content were especially prevalent (Figure 2).

Figure 2: A comparison between videos with and without transcripts seen by Democrat and Republican bots, respectively.
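One way to make the reported asymmetry concrete is the share of politically labeled recommendations that align with an account's seeded party. The toy feeds below are invented for illustration; the paper's 11.8% and 7.5% figures come from roughly 394,000 labeled videos, not from this sketch.

```python
def aligned_share(recommendations, seed_party):
    """Fraction of politically labeled recommendations that match
    the account's seeded party (neutral videos are excluded)."""
    labeled = [r for r in recommendations if r != "neutral"]
    if not labeled:
        return 0.0
    return sum(1 for r in labeled if r == seed_party) / len(labeled)

# hypothetical labeled feeds for one Republican- and one Democratic-seeded account
rep_feed = ["rep", "rep", "neutral", "dem", "rep"]
dem_feed = ["dem", "rep", "neutral", "dem", "rep"]

# positive asymmetry means Republican-seeded accounts see more
# party-aligned content than Democratic-seeded accounts do
asymmetry = aligned_share(rep_feed, "rep") - aligned_share(dem_feed, "dem")
```

The paper computes this kind of comparison per state and per run, and reports that the gap persists after controlling for engagement metrics such as likes, views, shares, comments, and followers.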

An exploration of a variety of topics showed systematic discrepancies in topic coverage between Pro-Republican and Pro-Democratic videos. Topics stereotypically associated with Republicans, such as immigration and foreign policy, received disproportionate coverage in Republican-aligned videos compared to their Democratic counterparts (Figure 3).

Figure 3: (A, B) The proportion of videos on a given topic which are ideologically-aligned, ideologically-opposing, or neutral, seen by Democrat- and Republican-conditioned bots, respectively.

Implications and Future Work

The paper raises questions about TikTok's neutrality and the potential implications for shaping political narratives, especially considering its pivotal role in news dissemination to young voters. Understanding these biases could inform the development of more balanced algorithms and enhance oversight mechanisms to ensure equitable content distribution.

Potential avenues for future research include extending the study to post-election periods, integrating visual content analysis, and comparing TikTok's algorithmic behavior with that of other platforms. Expanding misinformation research on TikTok could also offer deeper insight into the platform's role in propagating or countering fake news.

Conclusion

Overall, the paper presents critical insights into the nature of content recommendation biases on TikTok during a significant electoral event. The findings spotlight the intricate challenges confronting social media platforms in maintaining neutrality, necessitating ongoing academic and regulatory scrutiny to safeguard democratic practices and informed citizenship.
