Adaptive Task Partitioning at Local Device or Remote Edge Server for Offloading in MEC

Published 12 Feb 2020 in cs.NI and eess.SP | arXiv:2002.04858v1

Abstract: Mobile edge computing (MEC) is a promising solution for processing computation-intensive tasks in emerging time-critical Internet-of-Things (IoT) use cases, e.g., virtual reality (VR), augmented reality (AR), and autonomous vehicles. Latency can be reduced further when a task is partitioned and computed collaboratively by multiple edge servers (ESs). However, state-of-the-art work studies MEC-enabled offloading based on a static framework, which partitions tasks at either the local user equipment (UE) or the primary ES; the dynamic selection between these two offloading schemes has not been well studied yet. In this paper, we investigate a dynamic offloading framework in a multi-user scenario, in which each UE decides which node partitions its task according to the network status, e.g., channel quality and allocated computation resources. Based on this framework, we model the latency to complete a task and formulate an optimization problem that minimizes the average latency among UEs. The problem is solved by jointly optimizing task partitioning and the allocation of communication and computation resources. Numerical results show that, compared with the static offloading schemes, the proposed algorithm achieves lower latency in all tested scenarios. Moreover, both mathematical derivation and simulation illustrate that the difference in wireless channel quality between a UE and different ESs can serve as an important criterion for determining the right scheme.
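
The decision the paper studies can be illustrated with a simplified latency comparison. The Python sketch below contrasts the two schemes for one UE and two ESs: partitioning at the UE (each subtask is uploaded over the UE's own wireless link to its ES) versus partitioning at the primary ES (the whole task is uploaded once, then split and forwarded over the backhaul). The latency expressions, parameter names, and example numbers are illustrative assumptions for a toy model, not the paper's formulation, which additionally optimizes the split ratios and the communication/computation resource allocation across multiple UEs.

```python
# A minimal sketch (not the paper's exact formulation) of the per-UE decision
# between the two offloading schemes. All parameter names and the simplified
# latency expressions below are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class Task:
    bits: float    # input data size in bits
    cycles: float  # required CPU cycles


def latency_partition_at_ue(task, split, rate_ue_to_es, cpu_es):
    """Scheme 1: the UE partitions the task locally and sends each subtask
    directly to its ES; the task finishes when the slowest branch finishes."""
    return max(
        split[k] * task.bits / rate_ue_to_es[k]   # wireless upload of subtask k
        + split[k] * task.cycles / cpu_es[k]      # remote computation at ES k
        for k in range(len(split))
    )


def latency_partition_at_primary_es(task, split, rate_ue_to_primary,
                                    rate_backhaul, cpu_es):
    """Scheme 2: the UE uploads the whole task to the primary ES (index 0),
    which partitions it and forwards subtasks to the other ESs over backhaul."""
    upload = task.bits / rate_ue_to_primary
    finish = max(
        split[0] * task.cycles / cpu_es[0],           # subtask kept at the primary ES
        max(split[k] * task.bits / rate_backhaul[k]   # backhaul forwarding ...
            + split[k] * task.cycles / cpu_es[k]      # ... plus remote computation
            for k in range(1, len(split)))
    )
    return upload + finish


def choose_scheme(task, split, rate_ue_to_es, rate_backhaul, cpu_es):
    """Dynamic selection: the UE picks whichever scheme gives lower latency."""
    t1 = latency_partition_at_ue(task, split, rate_ue_to_es, cpu_es)
    t2 = latency_partition_at_primary_es(task, split, rate_ue_to_es[0],
                                         rate_backhaul, cpu_es)
    return ("partition at UE", t1) if t1 <= t2 else ("partition at primary ES", t2)


if __name__ == "__main__":
    task = Task(bits=8e6, cycles=2e9)            # 1 MB task, 2 Gcycles
    split = [0.5, 0.5]                           # equal split over two ESs
    # Comparable uplinks to both ESs favour local partitioning; a much weaker
    # link to the secondary ES tends to favour partitioning at the primary ES.
    print(choose_scheme(task, split,
                        rate_ue_to_es=[50e6, 40e6],         # uplink rates (bit/s)
                        rate_backhaul=[float("inf"), 1e9],  # ES-to-ES backhaul (bit/s); index 0 unused
                        cpu_es=[5e9, 5e9]))                 # ES CPU speeds (cycles/s)
```

In this toy setting, the channel-quality gap between the UE's links to the two ESs is what tips the decision, which mirrors the criterion the abstract highlights.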

Citations (13)

Authors (2)
