On Caching with More Users than Files (1601.06383v2)

Published 24 Jan 2016 in cs.IT and math.IT

Abstract: Caching appears to be an efficient way to reduce peak-hour network traffic congestion by storing some content in users' caches without knowledge of their later demands. Recently, Maddah-Ali and Niesen proposed a two-phase (placement and delivery) coded caching strategy for centralized systems (where coordination among users is possible in the placement phase) and for decentralized systems. This paper investigates the same setup under the further assumption that the number of users is larger than the number of files. Using the same uncoded placement strategy of Maddah-Ali and Niesen, a novel coded delivery strategy is proposed that exploits the multicasting opportunities arising when a file is demanded by multiple users. The proposed delivery method is proved to be optimal under the constraint of uncoded placement for centralized systems with two files; moreover, it is shown to outperform known caching strategies for both centralized and decentralized systems.
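
To make the setup concrete, below is a minimal Python sketch of the Maddah-Ali–Niesen (MAN) centralized baseline that the paper builds on: uncoded placement of subfiles followed by XOR-coded multicast delivery. The function names, file contents, and the choice N=2, K=4, t=KM/N=2 are illustrative assumptions only; the paper's improved delivery for repeated demands (more users than files) is not reproduced here.

from itertools import combinations

def man_placement(files, K, t):
    # Split each file into C(K, t) equal subfiles, one per t-subset of users;
    # user k caches every subfile whose index set contains k (uncoded placement).
    subsets = list(combinations(range(K), t))
    subfiles = {}
    for n, data in enumerate(files):
        size = -(-len(data) // len(subsets))            # ceiling division
        padded = data.ljust(size * len(subsets), b'\0')
        for i, T in enumerate(subsets):
            subfiles[(n, T)] = padded[i * size:(i + 1) * size]
    caches = [{key: val for key, val in subfiles.items() if k in key[1]}
              for k in range(K)]
    return subfiles, caches

def man_delivery(subfiles, demands, K, t):
    # For every (t+1)-subset S of users, multicast the XOR of the subfile
    # each user in S is missing; every user in S can cancel the other terms.
    messages = {}
    for S in combinations(range(K), t + 1):
        xor = None
        for k in S:
            piece = subfiles[(demands[k], tuple(u for u in S if u != k))]
            xor = piece if xor is None else bytes(a ^ b for a, b in zip(xor, piece))
        messages[S] = xor
    return messages

if __name__ == "__main__":
    N, K, t = 2, 4, 2                      # more users (K=4) than files (N=2)
    files = [b"AAAAAAAAAAAA", b"BBBBBBBBBBBB"]
    demands = [0, 0, 1, 1]                 # repeated demands, as studied in the paper
    subfiles, caches = man_placement(files, K, t)
    msgs = man_delivery(subfiles, demands, K, t)

    # User 0 decodes its missing subfile of file demands[0] from message S = (0, 1, 2)
    # by XOR-ing out the subfiles it holds for the other users in S.
    S = (0, 1, 2)
    rec = msgs[S]
    for k in (1, 2):
        cached = caches[0][(demands[k], tuple(u for u in S if u != k))]
        rec = bytes(a ^ b for a, b in zip(rec, cached))
    assert rec == subfiles[(demands[0], (1, 2))]
    print(f"{len(msgs)} multicast messages sent by the MAN baseline")

In this toy instance the MAN delivery sends C(4, 3) = 4 coded messages; the abstract's point is that when several users demand the same file, a delivery designed for repeated demands can do better than this baseline.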

Authors (3)
  1. Kai Wan (67 papers)
  2. Daniela Tuninetti (89 papers)
  3. Pablo Piantanida (129 papers)
Citations (72)
