
This is not the End: Rethinking Serverless Function Termination (2211.02330v1)

Published 4 Nov 2022 in cs.DC and cs.PL

Abstract: Elastic scaling is one of the central benefits provided by serverless platforms, and requires that they scale resources up and down in response to changing workloads. Serverless platforms scale down resources by terminating previously launched instances (which are containers or processes). The serverless programming model ensures that terminating instances is safe assuming all application code running on the instance has either completed or timed out. Safety thus depends on the serverless platform correctly determining that application processing is complete. In this paper, we start with the observation that current serverless platforms do not account for pending asynchronous I/O operations when determining whether application processing is complete. These platforms are thus unsafe when executing programs that use asynchronous I/O, and incorrectly deciding that application processing has terminated can result in data inconsistency. We show that the reason for this problem is that current serverless semantics couple termination and response generation in serverless applications. We address this problem by proposing an extension to current semantics that decouples response generation and termination, and demonstrate the efficacy and benefits of our proposal by extending OpenWhisk, an open source serverless platform.
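
As a rough illustration of the hazard the abstract describes, the sketch below shows a hypothetical OpenWhisk-style Node.js action written in TypeScript. The names writeAuditLog, main, and mainSafe are invented for illustration and do not come from the paper; the asynchronous write is simulated with a timer rather than a real database client. The point is that the response is generated while an asynchronous write is still pending, which is exactly the state in which current platforms may consider processing complete and terminate the instance.

```typescript
// Simulated asynchronous I/O; stands in for a database or message-queue write.
function writeAuditLog(entry: string): Promise<void> {
  return new Promise((resolve) =>
    setTimeout(() => {
      console.log(`audit log persisted: ${entry}`);
      resolve();
    }, 200)
  );
}

// Unsafe pattern under current semantics: the write is started but not awaited,
// so the response is returned while the I/O is still in flight. If the platform
// treats the returned response as "processing complete" and terminates (or
// freezes) the instance, the pending write can be lost, producing the data
// inconsistency described in the abstract.
export async function main(params: { user?: string }): Promise<{ body: string }> {
  void writeAuditLog(`request from ${params.user ?? "anonymous"}`); // not awaited
  return { body: "ok" }; // response generated; pending I/O is invisible to the platform
}

// Workaround under current semantics: await the I/O before returning, which
// couples response latency to I/O completion. The paper's proposal instead
// decouples response generation from termination at the platform level.
export async function mainSafe(params: { user?: string }): Promise<{ body: string }> {
  await writeAuditLog(`request from ${params.user ?? "anonymous"}`);
  return { body: "ok" };
}
```

Under the semantics proposed in the paper, an action like main above could return its response early while the platform keeps the instance alive until the pending asynchronous work finishes, rather than forcing applications to choose between the two patterns shown here.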
