Energy-Efficient Proactive Caching with Multipath Routing (2104.13493v2)

Published 27 Apr 2021 in cs.NI

Abstract: The explosive growth of on-demand content distribution continues to place great pressure on mobile/wireless network infrastructures. To ease congestion in the network and to improve the perceived user experience, caching popular content closer to the end-users can play a significant role, and this issue has accordingly received considerable attention over the last few years. Energy efficiency is also treated as a fundamental requirement in the design of next-generation mobile networks. However, little attention has been paid to the overlap between energy efficiency and network caching, especially when multipath routing is considered. To this end, this paper proposes an energy-efficient caching scheme with multipath routing support. The proposed scheme jointly anchors popular content at a set of candidate caching nodes with optimized multipath support, while ensuring a balance between transmission and caching energy costs. The model also considers different content delivery modes, such as multicast and unicast, and a separate Integer Linear Programming (ILP) model is formulated for each delivery mode. To tackle the curse of dimensionality, we then provide a greedy simulated annealing algorithm, which not only reduces the time complexity but also achieves competitive performance. A wide set of numerical investigations reveals that the proposed scheme reduces energy consumption by up to 80% compared with other widely used caching approaches under network resource limitations. A sensitivity analysis with respect to the different parameters is also discussed in detail.
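The abstract's two key computational ingredients are an energy objective that balances transmission and caching cost, and a greedy simulated annealing heuristic used to sidestep the ILPs' curse of dimensionality. The sketch below is a minimal, generic illustration of that idea rather than the paper's actual formulation: the unicast-only energy model (per-hop transmission energy plus a fixed per-replica caching energy), and every function name and parameter, are assumptions made for the example.

```python
import math
import random

# Illustrative sketch only: the energy model and all parameters below are
# assumptions for demonstration and do not reproduce the paper's ILP models.

def total_energy(placement, requests, hops, tx_energy_per_hop, cache_energy):
    """Transmission plus caching energy for a unicast-only delivery model.

    placement: dict content -> frozenset of nodes holding a replica
    requests:  list of (user, content) pairs
    hops:      dict (user, node) -> hop count from the user to that node
    """
    tx = 0.0
    for user, content in requests:
        # Each request is served from the nearest replica of the content.
        tx += tx_energy_per_hop * min(hops[(user, n)] for n in placement[content])
    caching = cache_energy * sum(len(nodes) for nodes in placement.values())
    return tx + caching


def greedy_simulated_annealing(contents, nodes, requests, hops,
                               tx_energy_per_hop=1.0, cache_energy=5.0,
                               t_start=10.0, t_end=0.01, alpha=0.95,
                               iters_per_temp=50, seed=0):
    """Greedy initial placement refined by simulated annealing.

    contents and nodes are lists; hops must cover every (user, node) pair.
    """
    rng = random.Random(seed)

    # Greedy start: place one replica of each content at the node that
    # minimises the total hop count of that content's requests in isolation.
    placement = {}
    for c in contents:
        best_node = min(nodes, key=lambda n: sum(
            hops[(u, n)] for u, cc in requests if cc == c))
        placement[c] = frozenset({best_node})

    current_cost = total_energy(placement, requests, hops,
                                tx_energy_per_hop, cache_energy)
    best_placement, best_cost = dict(placement), current_cost

    t = t_start
    while t > t_end:
        for _ in range(iters_per_temp):
            # Neighbour move: toggle one (content, node) caching decision,
            # keeping at least one replica of every content.
            c, n = rng.choice(contents), rng.choice(nodes)
            replicas = set(placement[c])
            if n in replicas and len(replicas) > 1:
                replicas.remove(n)
            else:
                replicas.add(n)
            candidate = dict(placement)
            candidate[c] = frozenset(replicas)

            cost = total_energy(candidate, requests, hops,
                                tx_energy_per_hop, cache_energy)
            # Always accept improvements; accept worse placements with a
            # Boltzmann probability that shrinks as the temperature cools.
            if cost < current_cost or rng.random() < math.exp((current_cost - cost) / t):
                placement, current_cost = candidate, cost
                if cost < best_cost:
                    best_placement, best_cost = dict(candidate), cost
        t *= alpha

    return best_placement, best_cost
```

Under these assumptions, greedy_simulated_annealing(contents, nodes, requests, hops) returns the lowest-energy placement found; the paper's ILP models would instead optimize content anchoring and multipath routing jointly and exactly.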

Citations (4)
