Latency Minimization for Task Offloading in Hierarchical Fog-Computing C-RAN Networks (2003.11685v1)

Published 26 Mar 2020 in cs.IT, eess.SP, and math.IT

Abstract: Fog-computing networks combine cloud computing and fog access points (FAPs) equipped with mobile edge computing (MEC) servers to support computation-intensive tasks for mobile users. However, because FAPs have limited computational capabilities and are assisted only by a remote cloud center in the baseband processing unit (BBU) of the cloud radio access network (C-RAN), the latency benefits of this fog-computing C-RAN network may be eroded when facing a large number of offloading requests. In this paper, we investigate the delay minimization problem for task offloading in a hierarchical fog-computing C-RAN network, which consists of three tiers of computational services: MEC servers in radio units, MEC servers in distributed units, and cloud computing in central units. The receive beamforming vectors, task allocation, computing speed for offloaded tasks in each server, and the transmission bandwidth split of the fronthaul links are jointly optimized by solving the formulated mixed-integer programming problem. Simulation results validate the superiority of the proposed hierarchical fog-computing C-RAN network in terms of delay performance.
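To make the flavor of the task-allocation part of this problem concrete, the snippet below is a minimal sketch, not the paper's formulation: it brute-forces a task-to-tier assignment that minimizes total offloading delay under a simplified delay model (transmission delay plus computation delay with shared CPU speed). All tier names, rates, CPU speeds, and task sizes are hypothetical placeholders, and the sketch omits the receive beamforming and fronthaul bandwidth-split variables that the paper optimizes jointly.

```python
# Minimal sketch (not the paper's algorithm): brute-force task-to-tier
# assignment minimizing total offloading delay in a three-tier hierarchy
# (RU MEC, DU MEC, CU cloud). All numbers are hypothetical placeholders.
from itertools import product

# Hypothetical per-tier parameters: (uplink rate in Mbit/s, CPU speed in Gcycles/s)
TIERS = {
    "RU_MEC": (80.0, 5.0),     # close to the user: fast link, small server
    "DU_MEC": (50.0, 20.0),    # mid tier: extra fronthaul hop, larger server
    "CU_CLOUD": (30.0, 100.0), # cloud: slowest effective link, most compute
}

# Hypothetical tasks: (input size in Mbit, workload in Gcycles)
TASKS = [(2.0, 4.0), (1.0, 10.0), (4.0, 2.0), (3.0, 6.0)]

def task_delay(task, tier, load):
    """Delay of one task = transmission delay + computation delay,
    with the tier's CPU speed shared equally among `load` tasks."""
    size, cycles = task
    rate, speed = TIERS[tier]
    return size / rate + cycles / (speed / load)

def total_delay(assignment):
    """Sum of per-task delays for a given task -> tier assignment."""
    loads = {t: assignment.count(t) for t in TIERS}
    return sum(task_delay(task, tier, loads[tier])
               for task, tier in zip(TASKS, assignment))

# Exhaustive search over all assignments (fine for a handful of tasks);
# the paper instead solves a mixed-integer program over all variables.
best = min(product(TIERS, repeat=len(TASKS)), key=total_delay)
print("best assignment:", best)
print("total delay (s):", round(total_delay(best), 3))
```

Even in this toy setting, the trade-off the paper targets is visible: small tasks tend to stay at the lower MEC tiers to avoid fronthaul transmission delay, while compute-heavy tasks are pushed toward the cloud despite the slower effective link.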

Citations (14)
