Contrasting Web Robot and Human Behaviors with Network Models (1801.09715v1)

Published 29 Jan 2018 in cs.SI

Abstract: The web graph is a commonly-used network representation of the hyperlink structure of a website. A network similar in structure to the web graph, which we call the session graph, has properties that reflect the browsing habits of the agents in the web server logs. In this paper, we apply session graphs to compare the activity of humans against web robots or crawlers. Understanding these properties will enable us to improve models of HTTP traffic, which can be used to predict and generate realistic traffic for testing and improving web server efficiency, as well as devising new caching algorithms. We examine large-scale network properties, such as the connectivity and degree distribution, of human and Web robot session graphs in order to identify characteristics of the traffic that would be useful for modeling web traffic and improving cache performance. We find that the empirical degree distributions of session graphs for human and robot requests on one Web server are best fit by different theoretical distributions, indicating a difference in the processes that generate the traffic.
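
The abstract describes building session graphs from web server logs and comparing their degree distributions. As a rough illustration of that idea, here is a minimal sketch (not the paper's actual construction): it assumes log entries arrive as (session_id, requested_path) pairs in time order, links consecutive requests within a session by a directed edge, and then tabulates the empirical degree distribution with networkx. Function names, the toy log data, and the specific edge rule are all hypothetical.

```python
# Minimal sketch: build a per-session request graph from web server log
# entries and examine its degree distribution. Assumes each log entry is a
# (session_id, requested_path) pair in time order; the paper's exact
# construction of the session graph may differ.
from collections import Counter

import networkx as nx


def build_session_graph(log_entries):
    """log_entries: iterable of (session_id, path) tuples, in time order."""
    last_path = {}  # most recent path requested in each session
    graph = nx.DiGraph()
    for session_id, path in log_entries:
        graph.add_node(path)
        if session_id in last_path:
            # Link the previous request in this session to the current one.
            graph.add_edge(last_path[session_id], path)
        last_path[session_id] = path
    return graph


def degree_distribution(graph):
    """Return {degree: fraction of nodes with that degree}."""
    counts = Counter(d for _, d in graph.degree())
    n = graph.number_of_nodes()
    return {deg: c / n for deg, c in sorted(counts.items())}


# Toy usage: one human-like session and one robot-like (crawler) session.
logs = [
    ("human-1", "/index.html"), ("human-1", "/about.html"),
    ("robot-1", "/index.html"), ("robot-1", "/a.html"),
    ("robot-1", "/b.html"), ("robot-1", "/c.html"),
]
g = build_session_graph(logs)
print(degree_distribution(g))
```

Once empirical degree distributions are computed for the human and robot graphs separately, they can be fit against candidate theoretical distributions (e.g. with standard fitting libraries) to compare the two traffic classes, which is the kind of comparison the abstract reports.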

Citations (5)
