On Combination Networks with Cache-aided Relays and Users (1803.06123v1)

Published 16 Mar 2018 in cs.IT and math.IT

Abstract: Caching is an efficient way to reduce peak-hour network traffic congestion by storing some contents in the users' caches without knowledge of later demands. The coded caching strategy was originally proposed by Maddah-Ali and Niesen to provide an additional coded caching gain compared to the conventional uncoded scheme. Motivated by practical considerations, the caching model was recently studied in relay networks, in particular the combination network, where the central server communicates with $K=\binom{H}{r}$ users (each with a cache of $M$ files) through $H$ intermediate relays, and each user is connected to a different $r$-subset of relays. Several inner and outer bounds have been proposed for combination networks with end-user caches. This paper extends the authors' recent work on centralized combination networks with end-user caches to a more general setting in which both relays and users have caches. In contrast to existing schemes, where the packets transmitted from the server are independent of the relays' cached contents, we propose a novel caching scheme that creates an additional coded caching gain on the load transmitted from the server by exploiting the contents cached at the relays. We also show that the proposed scheme outperforms the state-of-the-art approaches.
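
For readers unfamiliar with the combination-network topology described in the abstract, the following minimal Python sketch (with hypothetical parameters `H = 4`, `r = 2`, not taken from the paper) enumerates the users as the distinct $r$-subsets of relays, illustrating why the network serves exactly $K=\binom{H}{r}$ users.

```python
from itertools import combinations
from math import comb

# Hypothetical example parameters: H relays, each user connected to r of them.
H, r = 4, 2

relays = list(range(H))
# Each user corresponds to a distinct r-subset of the H relays.
users = list(combinations(relays, r))

K = len(users)
assert K == comb(H, r)  # K = C(H, r) users, as in the combination-network model

for k, subset in enumerate(users):
    print(f"user {k} <-> relays {subset}")
```

With `H = 4` and `r = 2` this prints six users, matching $\binom{4}{2}=6$; the paper's setting additionally equips each user (and, in this work, each relay) with a cache.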

Authors (4)
  1. Kai Wan (67 papers)
  2. Daniela Tuninetti (89 papers)
  3. Pablo Piantanida (129 papers)
  4. Mingyue Ji (86 papers)
Citations (3)