Workload Scheduling on heterogeneous Mobile Edge Cloud in 5G networks to Minimize SLA Violation (2003.02820v2)

Published 5 Mar 2020 in cs.DC and cs.NI

Abstract: Smart devices have become an indispensable part of our lives and are finding increasing applicability in almost every area. Latency-aware applications such as Augmented Reality (AR), autonomous driving, and online gaming demand more resources such as network bandwidth and computational capability. Since traditional mobile networks cannot meet these bandwidth and latency requirements, the Mobile Edge Cloud (MEC) has emerged to provide cloud computing capabilities in the proximity of users in 5G networks. In this paper, we consider a heterogeneous MEC network with numerous mobile users who send their tasks to MEC servers. Each task has a maximum acceptable response time. The non-uniform distribution of users turns some MEC servers into hotspots that cannot accept additional tasks. One solution is to relocate tasks among MEC servers, known as Workload Migration. We formulate this task scheduling problem as a mixed-integer non-linear optimization problem that minimizes the number of Service Level Agreement (SLA) violations. Since solving this optimization problem has high computational complexity, we introduce a greedy algorithm called MESA, the Migration-Enabled Scheduling Algorithm, which quickly reaches a near-optimal solution. Our experiments show that, in terms of SLA violations, MESA is only 8% and 11% away from the optimal solution in the average and worst cases, respectively. Moreover, the migration-enabled solution can reduce SLA violations by about 30% compared to assigning tasks to MEC servers without migration.
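To illustrate the migration-enabled greedy idea the abstract describes, here is a minimal sketch in the spirit of MESA. The server capacities, task parameters, and the simple queueing-plus-service latency model are illustrative assumptions, not the paper's exact formulation: each task is greedily placed on the server that yields the smallest response time, and an SLA violation is counted whenever that time exceeds the task's deadline.

```python
# Hypothetical sketch of migration-enabled greedy scheduling (not the
# paper's exact MESA algorithm); the latency model is an assumption.

def greedy_schedule(tasks, servers, migrate=True):
    """Assign each task to a MEC server and count SLA violations.

    tasks:   list of (home_server, cycles, deadline) tuples
    servers: dict mapping server_id -> processing rate (cycles/sec)
    Returns the number of tasks whose response time exceeds the deadline.
    """
    load = {s: 0.0 for s in servers}  # queued work per server, in cycles
    violations = 0
    for home, cycles, deadline in tasks:
        # Response time on server s: queued work plus this task's work,
        # divided by the server's processing rate.
        def resp(s):
            return (load[s] + cycles) / servers[s]

        # With migration, any server is a candidate; without it, the
        # task must run on its home server.
        candidates = list(servers) if migrate else [home]
        best = min(candidates, key=resp)  # greedy: smallest response time
        if resp(best) > deadline:
            violations += 1
        load[best] += cycles
    return violations


# Toy example: two identical servers, four tasks all arriving at server A.
servers = {"A": 10.0, "B": 10.0}
tasks = [("A", 10.0, 2.0)] * 4
print(greedy_schedule(tasks, servers, migrate=False))  # 2 violations
print(greedy_schedule(tasks, servers, migrate=True))   # 0 violations
```

In the toy example, the hotspot server A can only serve two tasks within the deadline, so disallowing migration yields two violations, while spreading the load over both servers avoids them all, mirroring the roughly 30% improvement the paper reports for migration-enabled scheduling.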

Authors (4)
Citations (3)
