
On the complexity of cache analysis for different replacement policies (1811.01740v2)

Published 5 Nov 2018 in cs.PL, cs.AR, and cs.CC

Abstract: Modern processors use cache memory: a memory access that "hits" the cache returns early, while a "miss" takes more time. Given a memory access in a program, cache analysis consists in deciding whether this access is always a hit, always a miss, or is a hit or a miss depending on execution. Such an analysis is of high importance for bounding the worst-case execution time of safety-critical real-time programs. There exist multiple possible policies for evicting old data from the cache when new data are brought in, and different policies, though apparently similar in goals and performance, may be very different from the analysis point of view. In this paper, we explore these differences from a complexity-theoretical point of view. Specifically, we show that, among the common replacement policies, LRU (Least Recently Used) is the only one whose analysis is NP-complete, whereas the analysis problems for the other policies are PSPACE-complete.
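
To make the setting concrete, below is a minimal sketch (in Python, not taken from the paper) of how two of the replacement policies discussed behave on the same accesses: a fully associative cache with a given number of lines, fed a short access trace. The trace, the cache size, and the function names are illustrative assumptions only.

    # Minimal sketch (illustrative, not from the paper): hit/miss behaviour
    # of a fully associative cache under LRU and FIFO replacement.

    from collections import OrderedDict, deque

    def simulate_lru(trace, ways):
        """Hit/miss flag per access under LRU replacement."""
        cache = OrderedDict()  # keys ordered from least to most recently used
        result = []
        for block in trace:
            hit = block in cache
            if hit:
                cache.move_to_end(block)       # refresh recency on a hit
            else:
                if len(cache) == ways:
                    cache.popitem(last=False)  # evict least recently used
                cache[block] = None
            result.append("hit" if hit else "miss")
        return result

    def simulate_fifo(trace, ways):
        """Hit/miss flag per access under FIFO replacement."""
        cache = deque()  # insertion order only; hits do not reorder
        result = []
        for block in trace:
            hit = block in cache
            if not hit:
                if len(cache) == ways:
                    cache.popleft()            # evict oldest inserted block
                cache.append(block)
            result.append("hit" if hit else "miss")
        return result

    trace = ["a", "b", "a", "c", "a", "b"]
    print("LRU :", simulate_lru(trace, ways=2))
    # ['miss', 'miss', 'hit', 'miss', 'hit', 'miss']
    print("FIFO:", simulate_fifo(trace, ways=2))
    # ['miss', 'miss', 'hit', 'miss', 'miss', 'miss']

On the fifth access the two policies diverge: LRU hits (block "a" was refreshed by the earlier hit) while FIFO misses (hits do not update FIFO's eviction order). The analysis problem studied in the paper is harder than this simulation: it must classify an access as always-hit, always-miss, or execution-dependent over all feasible paths of a program, not just one concrete trace, and it is that quantification over paths that yields the NP- and PSPACE-completeness results.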

Authors (2)
  1. David Monniaux (48 papers)
  2. Valentin Touzeau (5 papers)
Citations (4)
