"They've Over-Emphasized That One Search": Controlling Unwanted Content on TikTok's For You Page
Abstract: Modern algorithmic recommendation systems seek to engage users through behavioral content-interest matching. While many platforms recommend content based on engagement metrics, others, like TikTok, deliver interest-based content, resulting in recommendations perceived as hyper-personalized relative to other platforms. TikTok's robust recommendation engine has led some users to suspect that the algorithm knows them "better than they know themselves," but this is not always the case. In this paper, we explore TikTok users' perceptions of recommended content on their For You Page (FYP), with particular attention to unwanted recommendations. Through qualitative interviews with 14 current and former TikTok users, we find themes of frustration with recommended content, attempts to rid themselves of unwanted content, and varying degrees of success in eschewing such content. We discuss implications in the larger context of folk theorization and contribute concrete tactical and behavioral examples of algorithmic persistence.