AoI-Delay Tradeoff in Mobile Edge Caching with Freshness-Aware Content Refreshing (2002.05868v1)

Published 14 Feb 2020 in cs.NI, cs.IT, and math.IT

Abstract: Mobile edge caching can effectively reduce service delay but may introduce information staleness, calling for timely content refreshing. However, content refreshing consumes additional transmission resources and may degrade the delay performance of mobile systems. In this work, we propose a freshness-aware refreshing scheme to balance service delay and content freshness, measured by Age of Information (AoI). Specifically, a cached content item is refreshed to the up-to-date version upon a user request if its AoI exceeds a certain threshold (referred to as the refreshing window). The average AoI and service delay are derived in approximate closed form, revealing an AoI-delay tradeoff with respect to the refreshing window. In addition, the refreshing window is optimized to minimize the average delay while meeting the AoI requirements, and the results indicate that a smaller refreshing window should be set for popular content items. Extensive simulations on the OMNeT++ platform validate the analytical results and show that the proposed scheme suppresses frequent refreshing as the request arrival rate increases, reducing the average delay by around 80% while keeping the AoI below one second in heavily loaded scenarios.
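
The refreshing rule described in the abstract can be illustrated with a minimal sketch: serve a request from the edge cache if the item's AoI is within its refreshing window, otherwise fetch the up-to-date version (resetting the AoI at the cost of extra transmission delay). The names below (EdgeCache, refresh_window, fetch_from_origin) are illustrative assumptions, not identifiers from the paper, and the sketch omits the paper's queueing and delay analysis.

```python
import time

class EdgeCache:
    """Sketch of threshold-based, freshness-aware content refreshing."""

    def __init__(self, fetch_from_origin, refresh_window):
        self.fetch_from_origin = fetch_from_origin   # callable: item_id -> content
        self.refresh_window = refresh_window         # dict: item_id -> AoI threshold (seconds)
        self.store = {}                              # item_id -> (content, generation_time)

    def request(self, item_id):
        now = time.monotonic()
        cached = self.store.get(item_id)
        if cached is not None:
            content, generated_at = cached
            aoi = now - generated_at
            # Serve the cached copy if its AoI is within the refreshing window.
            if aoi <= self.refresh_window.get(item_id, float("inf")):
                return content
        # Cache miss or AoI above the window: refresh from the origin server,
        # which resets the AoI but incurs additional transmission delay.
        content = self.fetch_from_origin(item_id)
        self.store[item_id] = (content, time.monotonic())
        return content
```

Consistent with the paper's optimization result, one would configure smaller refresh_window values for popular items so that frequently requested content is kept fresher.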

Citations (45)
