
Determinism, Complexity, and Predictability in Computer Performance (1305.5408v1)

Published 23 May 2013 in nlin.CD, cs.IT, cs.PF, and math.IT

Abstract: Computers are deterministic dynamical systems (CHAOS 19:033124, 2009). Among other things, that implies that one should be able to use deterministic forecast rules to predict their behavior. That statement is sometimes, but not always, true. The memory and processor loads of some simple programs are easy to predict, for example, but those of more-complex programs like compilers are not. The goal of this paper is to determine why that is the case. We conjecture that, in practice, complexity can effectively overwhelm the predictive power of deterministic forecast models. To explore that, we build models of a number of performance traces from different programs running on different Intel-based computers. We then calculate the permutation entropy, a temporal entropy metric that uses ordinal analysis, of those traces and correlate those values against the prediction success.
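
Permutation entropy is the central metric in the abstract, so a small worked example may help. The sketch below is a minimal, illustrative Python implementation of the Bandt-Pompe ordinal-pattern procedure the abstract refers to; the function name, parameters, and defaults are assumptions for illustration, not the authors' code or the paper's exact settings.

```python
# Minimal sketch of permutation entropy via ordinal analysis, assuming a
# 1-D performance trace (e.g., processor-load samples) as input.
import math
from collections import Counter

def permutation_entropy(series, order=4, delay=1, normalize=True):
    """Shannon entropy of the ordinal-pattern distribution of `series`."""
    patterns = Counter()
    n = len(series) - (order - 1) * delay
    for i in range(n):
        window = series[i : i + order * delay : delay]
        # Ordinal pattern: the permutation that sorts the window's values.
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        patterns[pattern] += 1
    probs = [count / n for count in patterns.values()]
    h = -sum(p * math.log(p) for p in probs)
    # Normalizing by log(order!) maps the result into [0, 1].
    return h / math.log(math.factorial(order)) if normalize else h

# Example: a periodic trace yields low entropy; noisy traces approach 1.
print(permutation_entropy([math.sin(0.1 * t) for t in range(1000)]))
```

In the spirit of the paper's conjecture, low permutation entropy would indicate a trace that deterministic forecast models should predict well, while values near 1 indicate complexity that overwhelms such models.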

Authors (3)
  1. Joshua Garland (35 papers)
  2. Ryan James (3 papers)
  3. Elizabeth Bradley (28 papers)
Citations (4)
