Leveraging the Flow of Collective Attention for Computational Communication Research (1710.07761v1)

Published 21 Oct 2017 in cs.CY and cs.SI

Abstract: Human attention becomes an increasingly important resource for our understanding of collective human behaviors in the age of information explosion. To better understand the flow of collective attention, we construct the attention flow network using anonymous smartphone data from 100,000 users in a major city of China. In the constructed network, nodes are websites visited by users, and links denote users' switches between two websites. We quantify the flow of collective attention by computing flow network statistics, such as flow impact, flow dissipation, and flow distance. The findings reveal a strong concentration and fragmentation of collective attention for smartphone users, while the duplication of attention across websites proves to be unfounded in mobile use. We further confirm the law of dissipation and the allometric scaling of flow impact. Surprisingly, there is a centralized flow structure, suggesting that websites with large traffic can easily control the circulated collective attention. Additionally, we find that flow network analysis can effectively explain the page views and sales volume of products. Finally, we discuss the benefits and limitations of using flow network analysis for computational communication research.
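To make the construction concrete, below is a minimal sketch of how an attention flow network of the kind described in the abstract could be built from clickstream-style data, assuming per-user ordered lists of visited websites. The session records, site names, and the simple inflow/outflow/dissipation tally are illustrative assumptions, not the authors' exact definitions of flow impact, flow dissipation, or flow distance.

```python
import networkx as nx

# Hypothetical clickstream data: each user's ordered sequence of visited websites.
# The paper uses anonymized smartphone logs; these records are placeholders.
sessions = [
    ["search.example", "news.example", "shop.example"],
    ["news.example", "video.example"],
    ["search.example", "shop.example", "shop.example"],
]

# Build a directed, weighted attention flow network:
# nodes are websites, edge weights count user switches between consecutive sites.
G = nx.DiGraph()
for visits in sessions:
    for src, dst in zip(visits, visits[1:]):
        if src == dst:
            continue  # ignore self-transitions (user stays on the same site)
        if G.has_edge(src, dst):
            G[src][dst]["weight"] += 1
        else:
            G.add_edge(src, dst, weight=1)

# Simple per-site flow tallies, loosely inspired by the paper's measures:
# total attention flowing in, flowing onward, and the remainder that dissipates.
for site in G.nodes:
    inflow = sum(d["weight"] for _, _, d in G.in_edges(site, data=True))
    outflow = sum(d["weight"] for _, _, d in G.out_edges(site, data=True))
    dissipated = max(inflow - outflow, 0)  # attention that does not flow onward
    print(f"{site}: inflow={inflow}, outflow={outflow}, dissipated={dissipated}")
```

Running this on real session logs would yield the kind of weighted switching network on which the paper's flow statistics (impact, dissipation, distance) are computed.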

Authors (4)
  1. Cheng-Jun Wang (8 papers)
  2. Zhi-Cong Chen (1 paper)
  3. Qiang Qin (2 papers)
  4. Naipeng Chao (1 paper)
