Online Caching and Coding at the WiFi Edge: Gains and Tradeoffs (2001.07334v1)

Published 21 Jan 2020 in cs.NI

Abstract: Video content delivery at the wireless edge continues to be challenged by insufficient bandwidth and highly dynamic user behavior, both of which affect effective throughput and latency. Caching at the network edge and coded transmissions have been found to improve user performance in video content delivery. The caches at the wireless edge stations (BSs, APs) and at the users' end devices can be populated by pre-caching content or by using online caching policies. In this paper, we propose a system where content is cached at the users of a WiFi network via online caching policies, and coded delivery is employed by the WiFi AP to deliver the requested content to the user population. The content of each user's cache serves as side information for index coding. We also propose the LFU-Index cache replacement policy at the user, which demonstrably improves index coding opportunities at the WiFi AP for the proposed system. Through an extensive simulation study, we determine the gains achieved by caching and by index coding. Next, we analyze the tradeoffs between them in terms of data transmitted, latency, and throughput for different content request behaviors of the users. We also show that the proposed cache replacement policy performs better than traditional cache replacement policies such as LRU and LFU.
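
The abstract's central mechanism is that packets already cached at a user act as side information, letting the AP serve several requests with one XOR-coded broadcast. The sketch below is only a rough illustration of that idea under assumed names and a simple greedy grouping heuristic; it is not the paper's LFU-Index policy or its coded-delivery algorithm.

```python
from functools import reduce

def xor_blocks(blocks):
    # XOR equal-length byte strings into one coded packet.
    return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), blocks)

def greedy_index_coding(requests, caches):
    """
    requests: dict user -> packet_id the user currently wants
    caches:   dict user -> set of packet_ids cached at that user (side information)
    Returns groups of users; each group can be served by a single XOR-coded
    broadcast because every member caches the packets wanted by the others.
    (Greedy grouping is an illustrative assumption, not the paper's method.)
    """
    pending = list(requests)
    groups = []
    while pending:
        group = [pending.pop(0)]
        for v in list(pending):
            if all(requests[w] in caches[v] for w in group) and \
               all(requests[v] in caches[w] for w in group):
                group.append(v)
                pending.remove(v)
        groups.append(group)
    return groups

# Example: user A wants p1 and caches p2; user B wants p2 and caches p1.
requests = {"A": "p1", "B": "p2"}
caches = {"A": {"p2"}, "B": {"p1"}}
packets = {"p1": b"\x10\x20", "p2": b"\x0f\xf0"}

for group in greedy_index_coding(requests, caches):
    coded = xor_blocks([packets[requests[u]] for u in group])
    print(group, coded.hex())  # one broadcast serves the whole group;
    # each user recovers its packet by XOR-ing the broadcast with its cached packet.
```

In this toy case the AP sends one coded packet instead of two uncoded ones, which is the kind of transmission saving the paper quantifies against latency and throughput for different request behaviors.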

Authors (4)
  1. Lalhruaizela Chhangte (2 papers)
  2. Emanuele Viterbo (83 papers)
  3. Nikhil Karamchandani (46 papers)
  4. D Manjunath (4 papers)
Citations (5)
