
NV-Fogstore : Device-aware hybrid caching in fog computing environments (2010.10562v1)

Published 20 Oct 2020 in cs.NI and cs.AR

Abstract: Edge caching via the placement of distributed storage throughout the network is a promising solution to reduce the latency and network costs of content delivery. With the advent of 5G, billions of F-RAN (Fog Radio Access Network) nodes will be created and used for edge caching, and the total amount of memory deployed at the edge is expected to increase 100-fold. The DRAM-based caches currently used in CDNs (Content Delivery Networks) are extremely power-hungry and costly. Our purpose is to reduce the cost of ownership and the recurring cost of power consumption in an F-RAN node while maintaining Quality of Service. To this end, we propose NV-FogStore, a scalable hybrid key-value storage architecture that utilizes Non-Volatile Memories (such as RRAM, MRAM, and Intel Optane) in the edge cache. We further describe in detail H-GREEDY, a novel hierarchical content caching policy for this architecture that is aware of write damage, object size, and access frequency. We show that the policy can be tuned to performance objectives, lowering power, energy consumption, and total cost relative to a DRAM-only system for only a relatively small trade-off in average access latency.
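The abstract's key design point is a greedy placement rule that weighs object size, access frequency, and the wear cost of writing to NVM. The sketch below illustrates that general idea for a simplified two-tier (DRAM over NVM) key-value cache; the class name, the frequency-per-byte score, and the single nvm_write_penalty knob are assumptions made for illustration, not the paper's actual H-GREEDY policy or its tuning parameters.

```python
# Minimal sketch (assumed design, not the paper's H-GREEDY): a two-tier
# DRAM + NVM key-value cache with a greedy, write-damage-, size-, and
# frequency-aware placement rule.

class HybridKVCache:
    def __init__(self, dram_capacity, nvm_capacity, nvm_write_penalty=0.5):
        self.dram_capacity = dram_capacity          # DRAM tier capacity in bytes
        self.nvm_capacity = nvm_capacity            # NVM tier capacity in bytes
        self.nvm_write_penalty = nvm_write_penalty  # 0..1; higher values avoid NVM writes more
        self.dram, self.nvm = {}, {}                # key -> {"value", "size", "freq"}
        self.dram_used = self.nvm_used = 0

    def _score(self, entry, into_nvm=False):
        # Frequency-per-byte utility, discounted when the placement costs an NVM write.
        s = entry["freq"] / max(entry["size"], 1)
        return s * (1.0 - self.nvm_write_penalty) if into_nvm else s

    def get(self, key):
        for tier in (self.dram, self.nvm):
            if key in tier:
                tier[key]["freq"] += 1
                return tier[key]["value"]
        return None  # miss: caller fetches from origin and calls put()

    def put(self, key, value, size):
        self._insert("dram", key, {"value": value, "size": size, "freq": 1})

    def _insert(self, tier, key, entry):
        store, cap = (self.dram, self.dram_capacity) if tier == "dram" else (self.nvm, self.nvm_capacity)
        used = self.dram_used if tier == "dram" else self.nvm_used
        demoted = []
        # Evict the lowest-scoring residents until the new object fits.
        while used + entry["size"] > cap and store:
            victim_key = min(store, key=lambda k: self._score(store[k]))
            victim = store.pop(victim_key)
            used -= victim["size"]
            if tier == "dram":
                demoted.append((victim_key, victim))  # DRAM victims may move down to NVM
        if used + entry["size"] <= cap:
            store[key] = entry
            used += entry["size"]
        if tier == "dram":
            self.dram_used = used
        else:
            self.nvm_used = used
        for victim_key, victim in demoted:
            self._demote(victim_key, victim)

    def _demote(self, key, victim):
        # Write-damage awareness: only pay the NVM write if the victim's discounted
        # utility beats the coldest NVM resident, or if NVM has free space.
        if self.nvm_used + victim["size"] <= self.nvm_capacity:
            self._insert("nvm", key, victim)
            return
        if self.nvm:
            coldest = min(self.nvm.values(), key=self._score)
            if self._score(victim, into_nvm=True) <= self._score(coldest):
                return  # not worth the NVM wear; drop the victim
        self._insert("nvm", key, victim)
```

In this sketch, raising nvm_write_penalty makes demotion more conservative, trading NVM-tier hit ratio for fewer NVM writes, which mirrors the kind of power/cost versus latency trade-off the abstract describes at a high level.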

Citations (5)
