
Joint Task Offloading and Resource Allocation for IoT Edge Computing with Sequential Task Dependency (2110.12115v2)

Published 23 Oct 2021 in eess.SY and cs.SY

Abstract: Incorporating mobile edge computing (MEC) in the Internet of Things (IoT) enables resource-limited IoT devices to offload their computation tasks to a nearby edge server. In this paper, we investigate an MEC-assisted IoT system whose computation tasks are subject to sequential task dependency, which is critical for video stream processing and other intelligent applications. To minimize energy consumption per IoT device while limiting task processing delay, the task offloading strategy, communication resources, and computation resources are jointly optimized under both slow and fast fading channels. In slow fading channels, the formulated optimization problem is mixed-integer and non-convex. To solve this challenging problem, we decompose it into a one-dimensional search over task offloading decisions and a non-convex optimization problem for a given offloading decision. Through mathematical manipulations, the non-convex problem is transformed into a convex one, which is shown to be solvable with only the simple golden-section search method. In fast fading channels, an optimal online policy depending on the instantaneous channel state is derived. In addition, it is proved that the derived policy converges to the offline policy when the channel coherence time is low, which helps avoid extra computational complexity. Numerical results verify the correctness of our analysis and the effectiveness of our proposed strategies over existing methods.
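The solution structure described for slow fading channels — enumerating the task offloading decision and solving the remaining (convexified) subproblem with a one-dimensional golden-section search — can be illustrated with a minimal sketch. Everything below is an assumption for illustration only: the interpretation of the offloading decision as a split index over the sequentially dependent tasks, the single scalar resource variable, and the placeholder objective `toy_energy` all stand in for the paper's actual formulation.

```python
import math

N_TASKS = 5  # number of sequentially dependent tasks (illustrative value)

def golden_section_search(f, a, b, tol=1e-6):
    """Minimize a unimodal (e.g., convex) scalar function f on [a, b]."""
    invphi = (math.sqrt(5) - 1) / 2          # 1/phi ~= 0.618
    x1, x2 = b - invphi * (b - a), a + invphi * (b - a)
    f1, f2 = f(x1), f(x2)
    while b - a > tol:
        if f1 < f2:                          # minimum lies in [a, x2]
            b, x2, f2 = x2, x1, f1
            x1 = b - invphi * (b - a)
            f1 = f(x1)
        else:                                # minimum lies in [x1, b]
            a, x1, f1 = x1, x2, f2
            x2 = a + invphi * (b - a)
            f2 = f(x2)
    return 0.5 * (a + b)

def toy_energy(split, t_comm):
    """Hypothetical stand-in for the per-device energy objective once the
    offloading split and a scalar communication-time share are fixed;
    it is convex in t_comm, so golden-section search applies."""
    local, offloaded = split, N_TASKS - split
    return local * 1.0 + offloaded * (0.4 / max(t_comm, 1e-9) + 0.1 * t_comm)

# Outer one-dimensional search over offloading decisions (split index),
# inner golden-section search over the continuous resource variable.
best = None
for split in range(N_TASKS + 1):
    t_opt = golden_section_search(lambda t: toy_energy(split, t), 1e-3, 10.0)
    e_opt = toy_energy(split, t_opt)
    if best is None or e_opt < best[0]:
        best = (e_opt, split, t_opt)

print(f"best energy={best[0]:.3f}, split={best[1]}, comm time={best[2]:.3f}")
```

The sketch only mirrors the two-level structure stated in the abstract; the paper's actual subproblem jointly allocates communication and computation resources after a convexifying transformation, rather than optimizing a single toy variable.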

Authors (6)
  1. Xuming An (6 papers)
  2. Rongfei Fan (28 papers)
  3. Han Hu (196 papers)
  4. Ning Zhang (278 papers)
  5. Saman Atapattu (35 papers)
  6. Theodoros A. Tsiftsis (69 papers)