Demystifying Fog Computing: Characterizing Architectures, Applications and Abstractions (1702.06331v1)

Published 21 Feb 2017 in cs.DC

Abstract: Internet of Things (IoT) has accelerated the deployment of millions of sensors at the edge of the network, through Smart City infrastructure and lifestyle devices. Cloud computing platforms are often tasked with handling these large volumes and fast streams of data from the edge. Recently, Fog computing has emerged as a concept for low-latency and resource-rich processing of these observation streams, to complement Edge and Cloud computing. In this paper, we review various dimensions of system architecture, application characteristics and platform abstractions that are manifest in this Edge, Fog and Cloud eco-system. We highlight novel capabilities of the Edge and Fog layers, such as physical and application mobility, privacy sensitivity, and a nascent runtime environment. IoT application case studies based on first-hand experiences across diverse domains drive this categorization. We also highlight the gap between the potential and the reality of Fog computing, and identify challenges that need to be overcome for the solution to be sustainable. Together, our article can help platform and application developers bridge the gap that remains in making Fog computing viable.

Characterization and Challenges of Fog Computing

In their paper, "Demystifying Fog Computing: Characterizing Architectures, Applications and Abstractions," Varshney and Simmhan characterize fog computing, an emerging paradigm that extends cloud capabilities toward the edge of the network within the Internet of Things (IoT) ecosystem. The paper identifies the architectural, application, and abstraction characteristics needed to position fog computing between edge and cloud systems.

System Architecture

Fog computing is positioned as a layer that addresses the resource constraints of edge devices and the latency of distant cloud data centers. The authors present fog computing as an intermediary infrastructure capable of managing high-volume data streams from IoT sensor networks. It operates in close proximity to edge devices, offering lower latency and higher bandwidth than cloud resources, and facilitating local analytics and data caching. This positioning allows fog computing to perform resource-rich processing while maintaining connectivity to both edge devices and cloud data centers.
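To make this layering concrete, the following Python sketch (not from the paper; the tier names, latency figures, and capacity numbers are illustrative assumptions) shows how a latency- and capacity-aware placement routine might choose among edge, fog, and cloud for a given analytic task.

```python
from dataclasses import dataclass

@dataclass
class Tier:
    """One layer of the edge-fog-cloud hierarchy (illustrative values only)."""
    name: str
    latency_ms: float      # assumed round-trip latency from the sensor to this tier
    capacity_mips: float   # assumed compute capacity available to the task

# Hypothetical resource profile: the edge is closest but weakest,
# the cloud is richest but farthest, and fog sits in between.
TIERS = [
    Tier("edge",  latency_ms=1,   capacity_mips=500),
    Tier("fog",   latency_ms=10,  capacity_mips=5_000),
    Tier("cloud", latency_ms=100, capacity_mips=50_000),
]

def place_task(required_mips: float, latency_budget_ms: float) -> Tier | None:
    """Pick the closest tier that meets both the compute need and the latency budget."""
    for tier in TIERS:  # ordered nearest-first
        if tier.capacity_mips >= required_mips and tier.latency_ms <= latency_budget_ms:
            return tier
    return None  # no tier satisfies the request

if __name__ == "__main__":
    # A lightweight analytic fits on the edge; a heavier one is pushed to the fog.
    print(place_task(required_mips=300, latency_budget_ms=5).name)     # -> edge
    print(place_task(required_mips=2_000, latency_budget_ms=20).name)  # -> fog
```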

Application Domain

The research emphasizes fog computing's role in IoT and smart city applications, where data from widespread edge devices, such as video cameras and sensors, necessitates efficient processing and analysis. Fog computing is presented as an effective method to alleviate the latency and network overhead involved in transmitting extensive data volumes to the cloud, particularly when real-time or near-real-time analytics are required. Use cases such as urban surveillance and smart grids illustrate these benefits, showcasing fog computing’s potential to drive strategic decisions based on contextual data processed closer to edge devices.
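As an illustration of how a fog node can reduce upstream traffic in such applications, the sketch below (an assumed example, not a method from the paper) aggregates raw edge readings into per-window summaries so that only compact records, rather than every sample, are forwarded to the cloud.

```python
from statistics import mean

def fog_aggregate(readings, window=60):
    """Summarize raw edge readings at a fog node into per-window statistics,
    so only compact summaries travel upstream to the cloud."""
    summaries = []
    for start in range(0, len(readings), window):
        window_vals = readings[start:start + window]
        summaries.append({
            "count": len(window_vals),
            "mean": mean(window_vals),
            "max": max(window_vals),
        })
    return summaries

# Example: 1 Hz sensor samples over 5 minutes shrink from 300 raw values
# to 5 summary records before being shipped to the cloud.
samples = [20.0 + (i % 7) * 0.1 for i in range(300)]
print(len(fog_aggregate(samples)))  # -> 5
```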

System and Application Abstractions

An intrinsic part of fog computing's efficacy lies in the abstractions provided for the coordination and composition of applications across edge, fog, and cloud levels. Efficient orchestration strategies, whether centralized, hierarchical, or peer-to-peer, must accommodate dynamic application demands, network conditions, and varying resource capacities. These strategies ensure optimized task distribution and resilient application performance amidst the inherent mobility and computing heterogeneity of fog environments.
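A minimal sketch of such a composition abstraction is shown below; the pipeline stages, tier names, and availability map are hypothetical, and the simple fallback rule stands in for the centralized, hierarchical, or peer-to-peer orchestration strategies the paper surveys.

```python
# Each stage of a hypothetical video-analytics dataflow declares where it
# prefers to run; the orchestrator places it there when possible and
# escalates toward the cloud when the preferred tier is unavailable.

PIPELINE = [
    {"stage": "decode_frames",  "preferred": "edge"},
    {"stage": "detect_objects", "preferred": "fog"},
    {"stage": "train_model",    "preferred": "cloud"},
]

# Assumed availability map, e.g. maintained from node heartbeats.
AVAILABLE = {"edge": True, "fog": False, "cloud": True}
FALLBACK = {"edge": "fog", "fog": "cloud", "cloud": "cloud"}

def orchestrate(pipeline, available, fallback):
    """Assign each stage to its preferred tier, falling back toward the cloud
    whenever the preferred tier is currently unavailable."""
    placement = {}
    for step in pipeline:
        tier = step["preferred"]
        while not available.get(tier, False):
            if fallback[tier] == tier:
                raise RuntimeError(f"no tier available for {step['stage']}")
            tier = fallback[tier]
        placement[step["stage"]] = tier
    return placement

print(orchestrate(PIPELINE, AVAILABLE, FALLBACK))
# {'decode_frames': 'edge', 'detect_objects': 'cloud', 'train_model': 'cloud'}
```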

Challenges for Implementation

The paper rigorously outlines the challenges fog computing must overcome to transition from conceptual models to viable commercial solutions. Key challenges include programming interfaces that simplify application development on fog infrastructures, ensuring security and fault tolerance amid distributed and heterogeneous devices, and establishing economically sustainable service models. Additionally, the ability to predict user demands and optimally deploy fog resources is crucial for efficient management and scalable services.
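As a toy illustration of the demand-prediction challenge (not drawn from the paper; the forecast window, per-node capacity, and headroom factor are assumptions), the sketch below sizes a fog deployment from recently observed request rates.

```python
import math

def forecast_demand(history, window=3):
    """Naive moving-average forecast of the next interval's request rate."""
    recent = history[-window:]
    return sum(recent) / len(recent)

def plan_fog_capacity(history, per_node_capacity=100, headroom=1.2):
    """Decide how many fog nodes to keep active for the next interval,
    provisioning slightly above the forecast to absorb bursts."""
    expected = forecast_demand(history) * headroom
    return max(1, math.ceil(expected / per_node_capacity))

# Example: request rates observed over the last five intervals.
print(plan_fog_capacity([80, 120, 150, 170, 210]))  # -> 3 nodes
```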

Future Prospects

Although fog computing has not yet been widely deployed at commercial scale, its anticipated role in IoT ecosystems continues to evolve. As fog capabilities mature and infrastructure providers place resources closer to edge devices, fog computing could enable application scenarios previously constrained by latency and bandwidth limitations. Future developments are likely to see it integrated more deeply into smart city infrastructures and leveraged by network operators and IoT service providers.

In summary, Varshney and Simmhan's paper offers a comprehensive overview of fog computing, its theoretical underpinnings, practical applications, and the challenges it faces. As IoT continues its trajectory of rapid growth, fog computing could provide the critical bridge needed between edge sensors and centralized cloud infrastructures, empowering real-time analytics and rich data processing at unprecedented scales.

Authors (2)
  1. Prateeksha Varshney (4 papers)
  2. Yogesh Simmhan (59 papers)
Citations (164)