Learning to Cache: Distributed Coded Caching in a Cellular Network With Correlated Demands (2010.06195v1)

Published 13 Oct 2020 in cs.IT and math.IT

Abstract: The design of distributed caching mechanisms is an active area of research because caching promises to reduce the data load on the backhaul links of a cellular network. This paper considers the problem of distributed content caching in a wireless network of small-cell base stations (sBSs) with the goal of maximizing cache hit performance. Most existing works assume static demands; here, the demand data at each sBS is considered to be correlated across both time and sBSs. The caching strategy is taken to be a weighted combination of past caching strategies. A high-probability generalization guarantee on the performance of the proposed caching strategy is derived. The guarantee yields the following insights for constructing the strategy: (i) run regret minimization at each sBS to obtain a sequence of caching strategies across time, and (ii) maximize an estimate of the bound to obtain a set of weights for the caching strategy, where the bound depends on the discrepancy. A theoretical guarantee on the performance of the LRFU caching strategy is also derived, and a federated-learning-based heuristic caching algorithm is proposed. Finally, simulations on the MovieLens dataset show that the proposed algorithm significantly outperforms the LRFU algorithm.
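To make the two-step construction concrete, the sketch below illustrates one possible reading of the abstract for a single sBS: a generic Hedge-style (multiplicative-weights) regret minimizer produces a sequence of per-round caching strategies, which are then mixed with a set of weights and the top-scoring files are cached. The Hedge update, the Zipf demand generator, and the uniform mixture weights are all placeholders chosen for illustration; the paper's actual regret minimizer, discrepancy-dependent weight optimization, and federated aggregation are not specified in the abstract.

```python
# Illustrative sketch only: a Hedge-style regret minimizer and uniform mixture
# weights stand in for the paper's actual procedure, which the abstract does
# not fully specify.
import numpy as np

def hedge_update(weights, rewards, eta=0.1):
    """One multiplicative-weights step: boost files that were requested this round."""
    w = weights * np.exp(eta * rewards)
    return w / w.sum()

def combined_strategy(history, mix_weights):
    """Weighted combination of past per-round strategies (mix_weights are assumed
    to come from maximizing the generalization-bound estimate in the paper)."""
    return np.average(np.stack(history), axis=0, weights=mix_weights)

def cache_top_m(strategy, m):
    """Cache the m files with the highest combined score."""
    return np.argsort(strategy)[-m:]

# Toy run for one sBS: 50 files, cache of size 5, 20 rounds of requests.
rng = np.random.default_rng(0)
n_files, cache_size, rounds = 50, 5, 20
weights = np.full(n_files, 1.0 / n_files)
history = []
for t in range(rounds):
    requests = rng.zipf(1.5, size=100) % n_files        # skewed demand stand-in
    rewards = np.bincount(requests, minlength=n_files) / 100.0
    weights = hedge_update(weights, rewards)
    history.append(weights.copy())

mix = np.ones(len(history)) / len(history)              # placeholder for bound-derived weights
print("cached files:", cache_top_m(combined_strategy(history, mix), cache_size))
```

In this toy setup the regret minimizer tracks request frequencies per round, and the mixture step corresponds to the weighted combination of past caching strategies described in the abstract; in the paper, the weights would instead be chosen by maximizing the derived bound, which accounts for the discrepancy of the correlated demand process.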
