Optimistic Online Caching for Batched Requests

Published 2 Oct 2023 in cs.NI (arXiv:2310.01309v1)

Abstract: In this paper we study online caching problems where predictions of future requests, e.g., provided by a machine learning model, are available. Typical optimistic online policies are based on the Follow-The-Regularized-Leader (FTRL) algorithm and have a higher computational cost than classic policies such as LFU and LRU, since each update of the cache state requires solving a constrained optimization problem. In this work we analyse the behaviour of two different optimistic policies in a batched setting, i.e., when the cache is updated less frequently in order to amortize the update cost over multiple requests. Experimental results show that such an optimistic batched approach outperforms classical caching policies on both stationary and real traces.
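To make the abstract concrete, here is a minimal sketch (not the paper's exact algorithm) of an FTRL-style optimistic cache with batched updates: requests are accumulated for B steps, then the fractional cache state is recomputed once by projecting the scaled cumulative request counts, plus an untrusted prediction of the next batch, onto the capacity constraint. The projection is the "constrained optimization problem" the abstract refers to; all function and variable names below are our own illustrative choices.

```python
import numpy as np

def project_capped_simplex(y, C, iters=60):
    """Project y onto {x : 0 <= x_i <= 1, sum(x) = C} by bisecting on
    the Lagrange multiplier tau (C is the cache capacity, in files)."""
    lo, hi = y.min() - 1.0, y.max() + 1.0
    for _ in range(iters):
        tau = 0.5 * (lo + hi)
        if np.clip(y - tau, 0.0, 1.0).sum() > C:
            lo = tau          # too much mass kept: raise the threshold
        else:
            hi = tau
    return np.clip(y - 0.5 * (lo + hi), 0.0, 1.0)

def batched_optimistic_cache(requests, batch_preds, N, C, B, eta=0.1):
    """Return the fractional cache state after each batch of B requests.

    FTRL-with-optimism sketch: the state played for the next batch is the
    projection of eta * (cumulative request counts + predicted counts of
    the upcoming batch). With a zero prediction this degenerates to a
    non-optimistic FTRL update.
    """
    grad_sum = np.zeros(N)       # cumulative request counts so far
    batch_counts = np.zeros(N)   # counts inside the current batch
    states = []
    for t, req in enumerate(requests, 1):
        batch_counts[req] += 1.0  # per-file counts = gradient of hit rate
        if t % B == 0:            # update only once per batch of B requests
            grad_sum += batch_counts
            pred = batch_preds[t // B - 1]   # prediction for the NEXT batch
            states.append(project_capped_simplex(eta * (grad_sum + pred), C))
            batch_counts[:] = 0.0
    return states
```

Because the projection runs once per batch rather than once per request, its cost is amortized over B requests, which is the efficiency argument the abstract makes for the batched variant.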

Citations (2)
