Delay-Energy Joint Optimization for Task Offloading in Mobile Edge Computing (1804.10416v1)

Published 27 Apr 2018 in cs.NI

Abstract: Mobile-edge computing (MEC) has been envisioned as a promising paradigm to meet the ever-increasing resource demands of mobile users, prolong the battery lives of mobile devices, and shorten the request response delays experienced by users. An MEC environment consists of many MEC servers and ubiquitous access points interconnected into an edge cloud network. Mobile users can offload their computation-intensive tasks to one or multiple MEC servers for execution to save their batteries. Because large numbers of MEC servers are deployed in MEC, selecting a subset of servers to serve user tasks while satisfying the delay requirements of their users is challenging. In this paper, we formulate a novel delay-energy joint optimization problem by jointly considering the CPU-cycle frequency scheduling at mobile devices, the selection of servers to serve user offloading tasks, and the allocation of tasks to the selected servers. To this end, we first formulate the problem as a mixed-integer nonlinear program; owing to the hardness of solving it directly, we then relax it into a nonlinear programming problem that can be solved in polynomial time. We also show how to derive a feasible solution to the original problem from the solution of this relaxed problem. We finally conduct experiments to evaluate the performance of the proposed algorithm. Experimental results demonstrate that the proposed algorithm is promising.
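
The abstract describes a relax-then-round workflow: formulate a mixed-integer nonlinear program, relax the binary server-selection variables, solve the relaxed nonlinear program, and recover a feasible solution from it. The sketch below illustrates that workflow only; the paper's exact delay and energy models are not given in the abstract, so every model, parameter name, and threshold here is a hypothetical stand-in.

```python
# Minimal sketch of a relax-then-round approach for delay-energy joint
# optimization (assumed, illustrative models; not the paper's formulation):
# (1) relax binary server selection to fractional task shares in [0, 1],
# (2) solve the resulting nonlinear program,
# (3) round the relaxed shares back to a feasible server selection.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(42)
n = 5                                # candidate MEC servers (assumed)
C = 1e9                              # task size in CPU cycles (assumed)
f_srv = rng.uniform(2e9, 6e9, n)     # server speeds, cycles/s (assumed)
e_tx = rng.uniform(0.05, 0.3, n)     # offloading energy per unit share (assumed)
kappa = 1e-27                        # device energy coefficient (assumed)
f_min, f_max = 0.5e9, 2.0e9          # device CPU-frequency range (assumed)
alpha = 0.5                          # delay-energy trade-off weight (assumed)

# Decision vector z = [f_dev, x_local, x_1..x_n, T]
#   f_dev: device CPU frequency, x_*: task shares, T: makespan (epigraph variable)
def unpack(z):
    return z[0], z[1], z[2:2 + n], z[-1]

def objective(z):
    f_dev, x_loc, x_off, T = unpack(z)
    # Device computing energy scales with f_dev^2 per cycle; offloading energy
    # is proportional to the offloaded shares (both assumed models).
    energy = kappa * f_dev**2 * x_loc * C + np.dot(e_tx, x_off)
    return alpha * T + (1 - alpha) * energy

constraints = [
    # all shares together must cover the whole task
    {"type": "eq", "fun": lambda z: z[1] + z[2:2 + n].sum() - 1.0},
    # local execution finishes by the makespan T
    {"type": "ineq", "fun": lambda z: z[-1] - z[1] * C / z[0]},
    # every server finishes its share by T
    {"type": "ineq", "fun": lambda z: z[-1] - z[2:2 + n] * C / f_srv},
]
bounds = [(f_min, f_max)] + [(0.0, 1.0)] * (n + 1) + [(0.0, None)]
z0 = np.concatenate([[1e9], np.full(n + 1, 1.0 / (n + 1)), [1.0]])

relaxed = minimize(objective, z0, bounds=bounds,
                   constraints=constraints, method="SLSQP")

# Round: keep only servers with a non-trivial relaxed share (threshold assumed),
# then renormalize so the offloaded shares again cover the remaining work.
f_dev, x_loc, x_off, T = unpack(relaxed.x)
keep = x_off > 0.05
x_feasible = np.where(keep, x_off, 0.0)
x_feasible *= (1.0 - x_loc) / max(x_feasible.sum(), 1e-12)

print("selected servers:", np.flatnonzero(keep))
print("relaxed shares:  ", np.round(x_off, 3))
print("feasible shares: ", np.round(x_feasible, 3))
```

The epigraph variable T turns the non-smooth makespan (the maximum completion time across the device and the selected servers) into smooth inequality constraints, which is a common way to keep such a relaxation tractable; the thresholding step is only one simple way to derive a feasible selection from the relaxed solution.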

Authors (4)
  1. Zhuang Wang (21 papers)
  2. Weifa Liang (8 papers)
  3. Meitian Huang (1 paper)
  4. Yu Ma (46 papers)
Citations (10)
