Reservoir: Named Data for Pervasive Computation Reuse at the Network Edge (2112.12388v1)

Published 23 Dec 2021 in cs.NI

Abstract: In edge computing use cases (e.g., smart cities), where several users and devices may be in close proximity to each other, computational tasks with similar input data for the same services (e.g., image or video annotation) may be offloaded to the edge. The execution of such tasks often yields the same results (output) and thus duplicate (redundant) computation. Based on this observation, prior work has advocated for "computation reuse", a paradigm where the results of previously executed tasks are stored at the edge and are reused to satisfy incoming tasks with similar input data, instead of executing these incoming tasks from scratch. However, realizing computation reuse in practical edge computing deployments, where services may be offered by multiple (distributed) edge nodes (servers) for scalability and fault tolerance, is still largely unexplored. To tackle this challenge, in this paper, we present Reservoir, a framework to enable pervasive computation reuse at the edge, while imposing marginal overheads on user devices and the operation of the edge network infrastructure. Reservoir takes advantage of Locality Sensitive Hashing (LSH) and runs on top of Named-Data Networking (NDN), extending the NDN architecture for the realization of the computation reuse semantics in the network. Our evaluation demonstrates that Reservoir can reuse computation with up to almost perfect accuracy, achieving 4.25-21.34x lower task completion times compared to cases without computation reuse.
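The core mechanism described in the abstract, using LSH to detect that an incoming task's input is similar to one already executed so the stored result can be reused, can be illustrated with a minimal sketch. This is not the paper's implementation: the random-hyperplane LSH family, the feature-vector inputs, and all names below are illustrative assumptions.

```python
import numpy as np

class ReuseCache:
    """Illustrative LSH-based computation reuse cache (not Reservoir's code)."""

    def __init__(self, dim, num_bits=16, seed=0):
        rng = np.random.default_rng(seed)
        # Random hyperplanes: nearby input vectors tend to share a sign pattern.
        self.planes = rng.standard_normal((num_bits, dim))
        self.store = {}  # LSH bucket key -> stored task result

    def _key(self, features):
        # Sign of each projection forms the bucket identifier.
        signs = (self.planes @ features) >= 0
        return signs.tobytes()

    def execute(self, features, compute_fn):
        key = self._key(features)
        if key in self.store:
            return self.store[key], True       # reuse a previously stored result
        result = compute_fn(features)           # otherwise execute from scratch
        self.store[key] = result
        return result, False

# Usage: two similar inputs likely fall into the same bucket, so the second call reuses the first result.
cache = ReuseCache(dim=128)
x = np.random.default_rng(1).standard_normal(128)
r1, reused1 = cache.execute(x, lambda f: f.sum())            # executed
r2, reused2 = cache.execute(x + 1e-3, lambda f: f.sum())     # likely reused
```

In the paper, this lookup is realized inside the NDN forwarding plane across distributed edge nodes rather than in a single in-memory table as sketched here.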

Authors (2)
  1. Md Washik Al Azad (10 papers)
  2. Spyridon Mastorakis (41 papers)
Citations (15)
