
Wireless Edge Computing with Latency and Reliability Guarantees (1905.05316v1)

Published 13 May 2019 in cs.NI and eess.SP

Abstract: Edge computing is an emerging concept based on distributing computing, storage, and control services closer to end network nodes. Edge computing lies at the heart of fifth generation (5G) wireless systems and beyond. While current state-of-the-art networks communicate, compute, and process data in a centralized manner (at the cloud), for latency- and compute-centric applications, both radio access and computational resources must be brought closer to the edge, harnessing the availability of computing- and storage-enabled small cell base stations in proximity to the end devices. Furthermore, the network infrastructure must enable a distributed edge decision-making service that learns to adapt to the network dynamics with minimal latency and optimizes network deployment and operation accordingly. This article provides a fresh look at the concept of edge computing by first discussing the applications that the network edge must provide, with a special emphasis on the ensuing challenges in enabling ultra-reliable and low-latency edge computing services for mission-critical applications such as virtual reality (VR), vehicle-to-everything (V2X), edge AI, and so forth. Furthermore, several case studies where the edge is key are explored, followed by insights and prospects for future work.

Authors (7)
  1. Mohammed S. Elbamby (13 papers)
  2. Cristina Perfecto (11 papers)
  3. Chen-Feng Liu (23 papers)
  4. Jihong Park (123 papers)
  5. Sumudu Samarakoon (52 papers)
  6. Xianfu Chen (38 papers)
  7. Mehdi Bennis (333 papers)
Citations (121)
