
PA-Cache: Evolving Learning-Based Popularity-Aware Content Caching in Edge Networks (2002.08805v2)

Published 20 Feb 2020 in cs.NI and cs.LG

Abstract: As ubiquitous and personalized services grow rapidly, massive numbers of mobile devices generate an increasingly large amount of network traffic. As a result, content caching is gradually extending to network edges to provide low-latency services, improve quality of service, and reduce redundant data traffic. Compared to conventional content delivery networks, caches in edge networks are smaller and usually have to accommodate burstier requests. In this paper, we propose an evolving learning-based content caching policy for edge networks, named PA-Cache. It adaptively learns time-varying content popularity and determines which contents should be replaced when the cache is full. Unlike conventional deep neural networks (DNNs), which learn a fine-tuned but possibly outdated or biased prediction model from the entire training dataset at high computational cost, PA-Cache weighs a large set of content features and trains a multi-layer recurrent neural network from shallow to deep as more requests arrive over time. We extensively evaluate PA-Cache on real-world traces from a large online video-on-demand service provider. The results show that PA-Cache outperforms existing popular caching algorithms and approximates the optimal algorithm with only a 3.8% performance gap when the cache percentage is 1.0%. PA-Cache also significantly reduces the computational cost compared to conventional DNN-based approaches.
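To illustrate the core idea of popularity-aware eviction described in the abstract, here is a minimal Python sketch. It uses an exponentially decayed request count as a stand-in popularity estimate; this is an illustrative assumption, not PA-Cache's actual recurrent-neural-network predictor or training scheme.

```python
class PopularityAwareCache:
    """Toy popularity-aware cache: on a miss with a full cache, evict the
    item with the lowest popularity score. A decayed request count stands
    in for a learned time-varying popularity predictor (assumption)."""

    def __init__(self, capacity, decay=0.9):
        self.capacity = capacity
        self.decay = decay      # per-request decay, so stale popularity fades
        self.scores = {}        # content id -> decayed popularity score
        self.cache = set()      # ids currently cached

    def request(self, cid):
        """Handle one content request; return True on a cache hit."""
        # Decay every score so older requests count less over time.
        for k in self.scores:
            self.scores[k] *= self.decay
        self.scores[cid] = self.scores.get(cid, 0.0) + 1.0

        if cid in self.cache:
            return True
        if len(self.cache) >= self.capacity:
            # Replace the cached item predicted to be least popular.
            victim = min(self.cache, key=lambda k: self.scores[k])
            self.cache.remove(victim)
        self.cache.add(cid)
        return False
```

For example, with a capacity of 2, a frequently requested item survives eviction while a rarely requested one is replaced first.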

Authors (6)
  1. Qilin Fan (5 papers)
  2. Xiuhua Li (8 papers)
  3. Jian Li (667 papers)
  4. Qiang He (38 papers)
  5. Kai Wang (624 papers)
  6. Junhao Wen (22 papers)
Citations (6)
