Cold Start Latency in Serverless Computing: A Systematic Review, Taxonomy, and Future Directions (2310.08437v2)

Published 12 Oct 2023 in cs.DC

Abstract: Serverless computing has recently attracted attention from both academia and industry because it offers dynamic scalability and a pay-per-use economic model. In serverless computing, users pay only for the time they actually consume resources, and scaling to zero optimises cost and resource utilisation. However, this approach also introduces the serverless cold start problem. Researchers have developed various solutions to address the cold start problem, yet it remains an open research area. In this article, we propose a systematic literature review on cold start latency in serverless computing. Furthermore, we create a detailed taxonomy of approaches to cold start latency, which we use to investigate existing techniques for reducing cold start time and frequency. We classify the current studies on cold start latency into several categories, such as caching and application-level optimisation-based solutions, as well as AI/Machine Learning (ML)-based solutions. Moreover, we analyse the impact of cold start latency on quality of service, explore current cold start mitigation methods, datasets, and implementation platforms, and classify them into categories based on their common characteristics and features. Finally, we outline the open challenges and highlight possible future directions.
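
To make the caching/keep-alive family of mitigations mentioned in the abstract concrete, below is a minimal, hedged sketch of a "warm-up pinger" that periodically sends lightweight asynchronous invocations so the platform is likely to keep at least one container warm. It assumes an AWS Lambda-style deployment; the function name, region, and interval are illustrative assumptions, not details from the paper, and this is only one of several mitigation strategies the survey covers.

```python
# Illustrative keep-alive sketch (not the paper's method): periodically invoke a
# serverless function with a marker payload so a warm container stays available.
import json
import time

import boto3  # AWS SDK for Python; requires configured credentials

FUNCTION_NAME = "my-serverless-function"   # hypothetical function name
WARMUP_INTERVAL_SECONDS = 5 * 60           # assumed ping period, shorter than the idle teardown window

lambda_client = boto3.client("lambda", region_name="us-east-1")

def send_warmup_ping() -> None:
    """Fire an async invocation the handler can recognise and short-circuit on."""
    lambda_client.invoke(
        FunctionName=FUNCTION_NAME,
        InvocationType="Event",  # asynchronous: we only want to keep a container warm
        Payload=json.dumps({"warmup": True}).encode(),
    )

if __name__ == "__main__":
    while True:
        send_warmup_ping()
        time.sleep(WARMUP_INTERVAL_SECONDS)
```

The handler would check for the `"warmup"` flag and return immediately, so the pings add negligible compute cost; the trade-off, as the survey discusses for keep-alive approaches, is that warm-keeping spends resources even when no real requests arrive.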

Authors (6)
  1. Muhammed Golec
  2. Guneet Kaur Walia
  3. Mohit Kumar
  4. Felix Cuadrado
  5. Sukhpal Singh Gill
  6. Steve Uhlig
Citations (8)
