Mitigating Cold Starts in Serverless Platforms: A Pool-Based Approach (1903.12221v1)

Published 28 Mar 2019 in cs.DC

Abstract: Rapid adoption of the serverless (or Function-as-a-Service, FaaS) paradigm, pioneered by Amazon with AWS Lambda and followed by numerous commercial offerings and open source projects, introduces new challenges in designing the cloud infrastructure, balancing between performance and cost. While instant per-request elasticity that FaaS platforms typically offer application developers makes it possible to achieve high performance of bursty workloads without over-provisioning, such elasticity often involves extra latency associated with on-demand provisioning of individual runtime containers that serve the functions. This phenomenon is often called cold starts, as opposed to the situation when a function is served by a pre-provisioned "warm" container, ready to serve requests with close to zero overhead. Providers are constantly working on techniques aimed at reducing cold starts. A common approach to reduce cold starts is to maintain a pool of warm containers, in anticipation of future requests. In this report, we address the cold start problem in serverless architectures, specifically under the Knative Serving FaaS platform. We describe our implementation leveraging a pool of function instances, and evaluate the latency compared to the original implementation, resulting in an 85% reduction of P99 response time for a single instance pool.
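The pool-based idea the abstract describes can be illustrated with a toy simulation: requests served from a pre-provisioned pool pay near-zero overhead, while an empty pool forces on-demand provisioning. This is only a minimal sketch under assumed names and delays (`Container`, `WarmPool`, the delay constants, and the asynchronous refill policy are all illustrative, not the paper's Knative implementation):

```python
import queue
import threading
import time

# Hypothetical costs (seconds): cold provisioning vs. warm dispatch.
COLD_START_DELAY = 0.05
WARM_START_DELAY = 0.001


class Container:
    """A runtime container; construction simulates on-demand provisioning."""

    def __init__(self):
        time.sleep(COLD_START_DELAY)  # the cold-start penalty

    def invoke(self, payload):
        time.sleep(WARM_START_DELAY)  # warm containers serve with minimal overhead
        return f"handled:{payload}"


class WarmPool:
    """Keep `size` pre-provisioned containers ready, refilling in the background."""

    def __init__(self, size=1):
        self.pool = queue.Queue()
        for _ in range(size):
            self.pool.put(Container())  # provisioning cost paid up front

    def _refill(self):
        self.pool.put(Container())

    def handle(self, payload):
        try:
            c = self.pool.get_nowait()  # warm path: container is already provisioned
            # Replenish asynchronously so the request doesn't wait on provisioning.
            threading.Thread(target=self._refill, daemon=True).start()
        except queue.Empty:
            c = Container()  # cold path: pay the full provisioning latency inline
        return c.invoke(payload)
```

With these assumed delays, a request hitting the warm pool completes in roughly `WARM_START_DELAY`, while an empty pool degrades to the cold-start cost, mirroring the latency gap the report measures.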

Authors (2)
  1. Ping-Min Lin
  2. Alex Glikson
Citations (60)
