Energy Complexity of Regular Languages (2204.06025v2)

Published 12 Apr 2022 in cs.CC

Abstract: Each step that results in a bit of information being "forgotten" by a computing device has an intrinsic energy cost. Although any Turing machine can be rewritten to be thermodynamically reversible without changing the recognized language, finite automata that are restricted to scan their input once in "real-time" fashion can only recognize the members of a proper subset of the class of regular languages in this reversible manner. We study the energy expenditure associated with the computations of deterministic and quantum finite automata. We prove that zero-error quantum finite automata have no advantage over their classical deterministic counterparts in terms of the maximum obligatory thermodynamic cost associated with any step during the recognition of different regular languages. We also demonstrate languages for which "error can be traded for energy", i.e., languages whose zero-error recognition is associated with computation steps having provably greater obligatory energy cost than their bounded-error recognition by real-time finite-memory quantum devices. We show that regular languages can be classified according to the intrinsic energy requirements they impose on the recognizing automaton as a function of input length, and prove upper and lower bounds.
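
For context, the intrinsic cost the abstract alludes to is Landauer's bound: erasing one bit dissipates at least k_B T ln 2 joules. A real-time deterministic automaton incurs such a loss precisely at steps where its per-symbol transition map is non-injective, since the predecessor state can then no longer be recovered. The following is a minimal sketch of that injectivity test; the toy DFA and all names (states, delta, is_reversible) are illustrative and not taken from the paper.

```python
# Sketch (not from the paper): a real-time DFA step "forgets" information
# whenever two states map to the same successor on the same symbol.
# A real-time DFA is reversible iff delta(., sigma) is injective for
# every input symbol sigma.

# Toy DFA over {0, 1} tracking whether the last symbol read was 0.
states = {"q_end0", "q_end1"}
alphabet = {"0", "1"}
delta = {
    ("q_end0", "0"): "q_end0",
    ("q_end1", "0"): "q_end0",  # two states merge on '0': one bit is lost
    ("q_end0", "1"): "q_end1",
    ("q_end1", "1"): "q_end1",  # and they merge again on '1'
}

def is_reversible(states, alphabet, delta):
    """Return True iff every per-symbol transition map is injective."""
    for sigma in alphabet:
        images = [delta[(q, sigma)] for q in states]
        if len(set(images)) != len(images):
            return False  # a merge: the predecessor state is unrecoverable
    return True

print(is_reversible(states, alphabet, delta))  # False: this DFA must forget
```

Every merged transition in such an automaton corresponds to at least one obligatory bit erasure, which is the step-level thermodynamic cost the paper quantifies across classical and quantum models.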
