
A Mobility-Aware Vehicular Caching Scheme in Content Centric Networks: Model and Optimization (1902.07014v1)

Published 19 Feb 2019 in cs.NI

Abstract: Edge caching is being explored as a promising technology to alleviate the network burden of cellular networks by separating computing functionalities from cellular base stations. However, the service capability of existing caching schemes is limited by fixed edge infrastructure when facing the uncertainties of users' requests and locations. Vehicular caching, which uses moving vehicles as cache carriers, is regarded as an efficient way to solve this problem. This paper studies the effectiveness of a vehicular caching scheme in content centric networks by developing an optimization model that minimizes network energy consumption. In particular, we model the interactions between caching vehicles and mobile users as a 2-D Markov process in order to characterize the network availability of mobile users. Based on the developed model, we propose an online vehicular caching design that optimizes network energy efficiency. Specifically, the caching decision-making problem is first formulated as a fractional optimization model targeting optimal energy efficiency. Using nonlinear fractional programming techniques and Lyapunov optimization theory, we derive a theoretical solution for the optimization model, and an online caching algorithm enabling optimal vehicular caching is developed from this solution. Finally, extensive simulations are conducted to examine the performance of our proposal. By comparison, our online caching scheme outperforms existing schemes in terms of energy efficiency, hit ratio, cache utilization, and system gain.
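The abstract mentions solving the fractional (ratio-form) energy-efficiency objective via nonlinear fractional programming. A standard technique of that kind is Dinkelbach's method, which reduces a ratio objective to a sequence of parametric subproblems. The sketch below is purely illustrative: the toy numerator `N(x)` and denominator `D(x)` are hypothetical stand-ins, not the paper's caching model.

```python
def dinkelbach(N, D, solve_parametric, q0=0.0, tol=1e-6, max_iter=100):
    """Maximize N(x)/D(x) with D(x) > 0 via Dinkelbach's method.

    solve_parametric(q) must return argmax_x N(x) - q * D(x)
    over the feasible set (the parametric subproblem).
    """
    q = q0
    x = solve_parametric(q)
    for _ in range(max_iter):
        x = solve_parametric(q)      # solve parametric subproblem
        gap = N(x) - q * D(x)        # zero gap certifies optimality
        if abs(gap) < tol:
            break                    # q now equals the optimal ratio
        q = N(x) / D(x)              # Dinkelbach update of the parameter
    return x, q

# Toy example: maximize (2x + 1) / (x + 2) over x in {0, 3}.
# The parametric objective is linear in x, so an endpoint is optimal.
if __name__ == "__main__":
    N = lambda x: 2 * x + 1
    D = lambda x: x + 2
    solve = lambda q: max((0.0, 3.0), key=lambda x: N(x) - q * D(x))
    x_star, q_star = dinkelbach(N, D, solve)
    print(x_star, q_star)  # 3.0 1.4
```

In the paper's setting, the parametric subproblem would correspond to the per-slot caching decision, which the authors combine with Lyapunov optimization to obtain an online algorithm.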

Citations (72)
