
Almost-catalytic Computation (2409.07208v2)

Published 11 Sep 2024 in cs.CC

Abstract: Designing algorithms for space-bounded models with restoration requirements on the space used by the algorithm is an important challenge posed by the catalytic computation model introduced by Buhrman et al. (2014). Motivated by scenarios where we do not need to restore unless it is useful, we define $ACL(A)$ to be the class of languages that can be accepted by almost-catalytic Turing machines with respect to $A$ (which we call the catalytic set) that use at most $c\log n$ work space and $n^c$ catalytic space. We show that if there are almost-catalytic algorithms for a problem with catalytic sets $A \subseteq \Sigma^*$ and its complement $\overline{A}$, respectively, then the problem can be solved by a ZPP algorithm. Using this, we derive that to design catalytic algorithms, it suffices to design almost-catalytic algorithms where the catalytic set is the set of strings of odd weight ($PARITY$). Towards this, we consider two complexity measures of the set $A$ which are maximized for $PARITY$: the random projection complexity ($\mathcal{R}(A)$) and the subcube partition complexity ($\mathcal{P}(A)$). By making use of error-correcting codes, we show that for all $k \ge 1$, there is a language $A_k \subseteq \Sigma^*$ such that $DSPACE(n^k) \subseteq ACL(A_k)$, where for every $m \ge 1$, $\mathcal{R}(A_k \cap \{0,1\}^m) \ge \frac{m}{4}$ and $\mathcal{P}(A_k \cap \{0,1\}^m) = 2^{m/4}$. This contrasts with the catalytic machine model, for which it is unclear whether it can accept all languages in $DSPACE(\log^{1+\epsilon} n)$ for any $\epsilon > 0$. Improving the partition complexity of the catalytic set $A$ further, we show that for all $k \ge 1$, there is an $A_k \subseteq \{0,1\}^*$ such that $DSPACE(\log^k n) \subseteq ACL(A_k)$, where for every $m \ge 1$, $\mathcal{R}(A_k \cap \{0,1\}^m) \ge \frac{m}{4}$ and $\mathcal{P}(A_k \cap \{0,1\}^m) = 2^{m/4 + \Omega(\log m)}$.
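For quick reference, the abstract's main claims can be restated as display formulas. This is a paraphrase of the abstract's own statements, not text from the paper: the complement notation $\overline{A}$ and the explicit set definition of $PARITY$ (the odd-weight strings) are spelled out here for readability, and the quantification over $A$ in the second line reflects how the abstract's conditional statement reads.

$$PARITY = \Big\{ x \in \{0,1\}^* : \sum_i x_i \equiv 1 \pmod{2} \Big\}$$

$$ACL(A) \cap ACL(\overline{A}) \subseteq ZPP \quad \text{for a catalytic set } A \subseteq \Sigma^*$$

$$\forall k \ge 1 \;\; \exists A_k: \quad DSPACE(n^k) \subseteq ACL(A_k), \quad \mathcal{R}(A_k \cap \{0,1\}^m) \ge \frac{m}{4}, \quad \mathcal{P}(A_k \cap \{0,1\}^m) = 2^{m/4} \;\; (\forall m \ge 1)$$

The last display corresponds to the error-correcting-code construction; the abstract's final result trades $DSPACE(n^k)$ for $DSPACE(\log^k n)$ while improving the partition complexity bound to $2^{m/4 + \Omega(\log m)}$.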

