
Fairness and Transmission-Aware Caching and Delivery Policies in OFDMA-Based HetNets (1711.02776v3)

Published 8 Nov 2017 in cs.IT and math.IT

Abstract: Recently, wireless edge caching has emerged as a promising technology for future wireless networks, coping with exponentially increasing demands for high-data-rate, low-latency multimedia services by proactively storing contents at the network edge. Here, we aim to design efficient cache placement and delivery strategies for an orthogonal frequency division multiple access (OFDMA)-based cache-enabled heterogeneous cellular network (C-HetNet), which operates in two separate phases: a caching phase (CP) and a delivery phase (DP). Since guaranteeing fairness among mobile users (MUs) is not well investigated in cache-assisted wireless networks, we first propose two delay-based fairness schemes, called proportional fairness (PF) and min-max fairness (MMF). The PF scheme minimizes the total weighted latency of the MUs, while MMF minimizes the maximum latency among them. In the CP, we propose a novel proactive fairness- and transmission-aware cache placement strategy (CPS) corresponding to each target fairness scheme, exploiting the flexible wireless access and backhaul transmission opportunities. Specifically, we jointly perform the allocation of physical resources, namely storage and radio, together with user association to improve the flexibility of the CPSs. Moreover, in the DP of each fairness scheme, an efficient delivery policy is proposed based on the arrival requests of the MUs, the channel state information (CSI), and the caching status. Numerical assessments demonstrate that our proposed CPSs reduce the total latency of the MUs by up to 27% compared to the conventional popularity-based baseline CPSs.
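The two fairness objectives in the abstract can be contrasted with a minimal sketch. This is not the paper's optimization formulation; the latency values, weights, and candidate allocations below are hypothetical, chosen only to show how PF and MMF can prefer different outcomes.

```python
# Hedged sketch of the two delay-based fairness objectives from the abstract.
# All numbers are illustrative assumptions, not from the paper.

def pf_objective(latencies, weights):
    """Proportional fairness (PF): total weighted latency, to be minimized."""
    return sum(w * d for w, d in zip(weights, latencies))

def mmf_objective(latencies):
    """Min-max fairness (MMF): worst-case user latency, to be minimized."""
    return max(latencies)

# Two hypothetical delivery outcomes for three mobile users (latency in ms).
alloc_a = [8.0, 10.0, 40.0]   # low total latency, one badly served user
alloc_b = [18.0, 20.0, 22.0]  # higher total latency, balanced across users
weights = [1.0, 1.0, 1.0]     # equal weights for simplicity

# PF prefers alloc_a (lower total: 58 vs 60), while MMF prefers alloc_b
# (lower worst case: 22 vs 40), illustrating the trade-off between the schemes.
print(pf_objective(alloc_a, weights), mmf_objective(alloc_a))  # 58.0 40.0
print(pf_objective(alloc_b, weights), mmf_objective(alloc_b))  # 60.0 22.0
```

The example shows why the paper derives a separate cache placement strategy per fairness scheme: an allocation that minimizes total latency need not minimize the maximum latency, and vice versa.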

Citations (28)
