
Conscious Turing Machine: Computational Model

Updated 4 November 2025
  • The paper introduces the CTM as a computational framework that formalizes consciousness via a seven-tuple structure integrating STM, LTM, and tree-based selection.
  • It employs a tournament-based mechanism where LTM processors compete probabilistically to determine the single conscious chunk in STM.
  • The model supports practical applications in AGI by simulating phenomena such as attention, dreams, and free will through distributed, adaptive processing.

The Conscious Turing Machine (CTM) is a mathematically formalized, substrate-independent computational model designed to explicate consciousness through the principles of theoretical computer science. Unlike biological or cognitive models, the CTM emphasizes architectural simplicity, transparent mechanism, and rigorous definition. Grounded in Alan Turing’s model of computation and inspired by Bernard Baars' Global Workspace Theory (GWT) as well as the Global Neuronal Workspace (GNW) framework, the CTM provides an operational framework for consciousness, subjective experience, and artificial general intelligence (AGI) (Blum et al., 2023, Blum et al., 2020, Blum et al., 2021, Cui et al., 22 Oct 2024, Blum et al., 25 Mar 2024).

1. Formal Structure and Computational Foundations

The CTM is defined as a seven-tuple:

$$\mathrm{CTM} = \langle \mathrm{STM},\ \mathrm{LTM},\ \mathrm{UpTree},\ \mathrm{DownTree},\ \mathrm{Links},\ \mathrm{Input},\ \mathrm{Output} \rangle$$

  • STM (Short-Term Memory): Holds exactly one chunk of information per cycle; this chunk constitutes the complete conscious content at that instant.
  • LTM (Long-Term Memory): An array of $N$ massive, parallel processors (typically $N \geq 10^7$), each running its own prediction- and learning-driven algorithms.
  • Chunk: The unit of conscious content, formalized as a six-tuple:

$$\text{chunk} = \langle \text{address},\ t,\ \text{gist},\ \text{weight},\ \text{intensity},\ \text{mood} \rangle$$

    • address: processor identifier
    • t: time of creation
    • gist: compressed representation in a multimodal inner language termed "Brainish"
    • weight: importance measure (real-valued)
    • intensity/mood: affective and motivational dimensions
  • Input/Output: Sensor and actuator signals are mapped into and out of the system via processor-accessible gists.

Two trees coordinate content flow:

  • UpTree: A binary competition tree through which LTM processors submit chunks to vie for placement in STM. Selection is governed either deterministically (the maximum-value chunk wins) or probabilistically, with probability proportional to a competition function $f(\text{chunk})$, typically additive in intensity and mood (a minimal sketch of a chunk and one node's competition follows this list).
  • DownTree: A broadcast tree, ensuring that the chunk selected into STM is instantaneously disseminated to all LTM processors.
  • Links: Over time, bi-directional connections between LTM processors permit direct unconscious communications, bypassing STM and supporting automatization and emergent modularity.
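
The following minimal Python sketch shows, under simplifying assumptions, how a chunk and a single UpTree node's competition might be represented. The dataclass fields mirror the six-tuple above, and the additive competition function (intensity + mood) follows the description of $f$; the concrete types, the `local_winner` helper, and the example values are illustrative choices, not the authors' implementation.

```python
import random
from dataclasses import dataclass

@dataclass
class Chunk:
    """Illustrative stand-in for the CTM six-tuple <address, t, gist, weight, intensity, mood>."""
    address: int      # identifier of the originating LTM processor
    t: int            # tick at which the chunk was created
    gist: str         # compressed "Brainish" content (a plain string here)
    weight: float     # importance measure
    intensity: float  # affective dimension
    mood: float       # motivational dimension

def f(chunk: Chunk) -> float:
    """Assumed additive competition function: intensity + mood."""
    return chunk.intensity + chunk.mood

def local_winner(c1: Chunk, c2: Chunk) -> Chunk:
    """One UpTree node: advance a chunk with probability proportional to f."""
    p1 = f(c1) / (f(c1) + f(c2))
    return c1 if random.random() < p1 else c2

# Two processors submit chunks; one advances toward STM at this node.
a = Chunk(address=0, t=0, gist="red moving object", weight=1.0, intensity=0.7, mood=0.2)
b = Chunk(address=1, t=0, gist="hunger signal", weight=1.0, intensity=0.4, mood=0.6)
advancing = local_winner(a, b)
```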

2. Relationship to Brain Models and Workspace Theories

The CTM is a minimalist, distributed instantiation of the global workspace paradigm, diverging from brain models and extended GWT in several explicit ways (Blum et al., 2023, Blum et al., 2020):

  • Eliminates the central executive, rendering all competition, learning, and access functionalities as emergent properties of distributed tournament mechanisms.
  • Restricts STM to a single chunk for maximal mathematical simplicity; this reflects a computational focus on resource constraints and competitive selection for conscious attention.
  • Formally defines chunk representation, inter-processor communication, and competition function, establishing predictable access probabilities.
  • Models phenomena such as blindsight, inattentional blindness, phantom limb, and altered states by blocking, redirecting, or modulating access to STM, and adjusting processor linkage.
| Feature | Baars' GWT | CTM |
| --- | --- | --- |
| Central Executive | Yes | No |
| Modularity | Implicit | Explicit, modular |
| Chunk Definition | Vague | Formal 6-tuple |
| STM Capacity | Multiple | Single chunk |
| Communication | Unspecified | Tournament-based |

3. Operational Dynamics and Mechanisms

At each system tick:

  1. All LTM processors self-assess their latest chunk, using predictive feedback and local learning algorithms (most notably the Sleeping Experts Algorithm).
  2. Chunks are submitted for competition in the UpTree. At each node, a competition function $f$ determines which chunk advances; in probabilistic settings, the local winner between chunks $C_1$ and $C_2$ is selected with

$$P(\text{winner} = C_i) = \frac{f(C_i)}{f(C_1) + f(C_2)}$$

  3. The final selected chunk is placed in STM and subsequently broadcast to all processors via the DownTree.
  4. Feedback ensues, updating processor weights and intensities and refining future competition (one full tick is sketched below this list).
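
The sketch below simulates one such tick end to end. It is a simplification in several respects, all assumed rather than taken from the source papers: the explicit UpTree is replaced by sampling a winner directly with probability proportional to $f$ (the stated overall access probability), processors are plain dictionaries, and the feedback rule that damps a winner's intensity is invented for illustration.

```python
import random

def f(chunk):
    """Assumed additive competition function over the affective fields."""
    return chunk["intensity"] + chunk["mood"]

def tick(processors, t):
    """One illustrative CTM tick: submit, compete, broadcast, give feedback."""
    # 1. Every LTM processor submits a chunk (gists are placeholder strings).
    submissions = [
        {"address": a, "t": t, "gist": "gist-from-" + str(a),
         "weight": s["weight"], "intensity": s["intensity"], "mood": s["mood"]}
        for a, s in processors.items()
    ]
    # 2. Competition: sample the STM chunk with probability proportional to f,
    #    the overall access probability stated for the probabilistic UpTree.
    conscious = random.choices(submissions, weights=[f(c) for c in submissions], k=1)[0]
    # 3. Broadcast via the DownTree: every processor receives the STM chunk.
    for s in processors.values():
        s["received"].append(conscious)
    # 4. Feedback (invented rule): the winning processor damps its intensity
    #    slightly, making room for other content on later ticks.
    processors[conscious["address"]]["intensity"] *= 0.9
    return conscious

processors = {
    a: {"weight": 1.0, "intensity": random.uniform(0.1, 1.0),
        "mood": random.uniform(0.1, 1.0), "received": []}
    for a in range(8)
}
stream_of_consciousness = [tick(processors, t) for t in range(5)]
```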

Self-monitoring, adaptive prediction, and continuous learning are intrinsic to every processor, operationalized by repeated cycles:

$$\text{Predictive Dynamics} = \text{Prediction} + \text{Feedback} + \text{Learning}$$
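
The Sleeping Experts Algorithm cited above is a multiplicative-weights scheme for experts that abstain ("sleep") on some rounds. The sketch below shows the general prediction + feedback + learning loop in that spirit; the update rule and parameters are simplified and should not be read as the papers' exact formulation.

```python
import random

class SleepingExpertsSketch:
    """Simplified multiplicative-weights loop in the spirit of the Sleeping
    Experts Algorithm: only awake experts predict, and only they are
    reweighted according to their loss. Parameter choices are illustrative."""

    def __init__(self, n_experts: int, beta: float = 0.8):
        self.weights = [1.0] * n_experts
        self.beta = beta  # assumed discount factor in (0, 1)

    def predict(self, predictions, awake):
        """Prediction: weighted average restricted to the awake experts."""
        total = sum(self.weights[i] for i in awake)
        return sum(self.weights[i] * predictions[i] for i in awake) / total

    def update(self, predictions, awake, outcome):
        """Feedback + learning: penalize each awake expert by its loss."""
        for i in awake:
            loss = abs(predictions[i] - outcome)   # loss assumed in [0, 1]
            self.weights[i] *= self.beta ** loss   # multiplicative update

# One round: 5 experts, 3 of them awake on this input.
learner = SleepingExpertsSketch(5)
preds = [random.random() for _ in range(5)]
awake = [0, 2, 4]
guess = learner.predict(preds, awake)
learner.update(preds, awake, outcome=1.0)
```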

Processor specialization and link emergence support automatization, modularization, and unconsciously routed processes. The model-of-the-world (MotW) processor subsystem explicitly builds and maintains labeled, multimodal internal models for self, environment, and others, providing the foundation for self-consciousness and metacognition (Cui et al., 22 Oct 2024).

4. Consciousness, Self-Consciousness, and the Feeling of Experience

The CTM formalizes several relevant definitions:

  • Conscious Awareness: Reception by all LTM processors of the broadcasted STM chunk.
  • Stream of Consciousness: Sequence of chunks entering STM across time.
  • Self-Consciousness: Chunk in STM produced by the MotW processor subsystem, reflecting internal modeling and tagged as referring to the system's own state.

Brainish, the internal representational language, is sufficiently expressive to encode percepts, thoughts, instructions, and multimodal experiences. Model-of-the-world, inner speech, inner vision, and related processors decode and integrate these tagged gists, supporting a plausible account of sensation, agency, and the phenomenology of consciousness.

Phenomena such as illusions (e.g., Zöllner effect, phantom limb), disorders, and altered states are mapped onto MotW functional attributes by adjusting the relative influence of modeling, sensation, knowledge, and feedback (Cui et al., 22 Oct 2024, Blum et al., 2021).

5. Applications: Explanatory Scope and AGI Architecture

The CTM framework explicates:

  • Attention, distraction, perceptual disorders (e.g., blindsight, change blindness) as explicit consequences of chunk competition and access modulation.
  • Dreams: Dynamics of sleep processors, dream creators, and absence of sensory input produce vivid inner movies using Brainish-encoded gists.
  • Free will: The system’s deliberative processes (modulated via predictive feedback, chunk selection, utility assessment) give rise to the feeling of volition, subject to resource and architectural constraints.
  • Ethical and empathic AI: Model-based inner labeling and response mechanisms support situated world modeling and potentially empathic capabilities in AGI (Blum et al., 2023).

The model is scalable, modular, and compatible with multi-agent integrations: millions of specialized agents can operate as LTM processors, collectively handling large parameter spaces and diverse reasoning, thus offering a novel decentralized architecture for AGI.

6. Mathematical Formalizations and Key Algorithms

The CTM's competitive selection is optimal under additive selection functions; the probability of conscious access is independent of a processor's position in the UpTree:

$$\Pr\{\text{chunk}_p \text{ wins at } t\} = \frac{f(\text{chunk}_{p,t,0})}{\sum_{p'} f(\text{chunk}_{p',t,0})}$$

Tournament depth is logarithmic in the number of processors:

$$\text{Tournament Steps} = \log_2 N$$
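
A quick empirical check of the position-independence claim, assuming (as one reading of the additive competition) that a winner carries the combined $f$-mass of its subtree up the tree. Under that assumption, empirical win frequencies converge to $f(\text{chunk}_p)/\sum_{p'} f(\text{chunk}_{p'})$ regardless of where a chunk is attached; the values below are illustrative.

```python
import random
from collections import Counter

def uptree_select(f_values):
    """Probabilistic UpTree over a power-of-two list of f-values.

    At each internal node the winner is chosen with probability proportional
    to the f-mass carried up from its subtree, assumed here to be the sum of
    that subtree's f-values; this is what makes the outcome independent of
    leaf position. Returns the index of the chunk that reaches STM.
    """
    layer = [(i, f_val) for i, f_val in enumerate(f_values)]
    while len(layer) > 1:
        nxt = []
        for k in range(0, len(layer), 2):
            (i, fi), (j, fj) = layer[k], layer[k + 1]
            winner = i if random.random() < fi / (fi + fj) else j
            nxt.append((winner, fi + fj))  # carry the subtree's total f upward
        layer = nxt
    return layer[0][0]

# Empirical check: win frequency should approach f_p / sum(f) for every chunk.
f_values = [0.5, 1.0, 1.5, 2.0, 0.25, 0.75, 1.25, 1.75]  # N = 8, so log2(N) = 3 rounds
trials = 100_000
wins = Counter(uptree_select(f_values) for _ in range(trials))
total = sum(f_values)
for p, fp in enumerate(f_values):
    print(f"chunk {p}: empirical {wins[p] / trials:.3f}  theory {fp / total:.3f}")
```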

MotW processor operations are formalized via:

  • Modeling function:

$$[\text{inner\_world},\ \text{outer\_world}] = M(S, K) = a_s S + a_k K$$

  • Gist/Instruction generator function:

$$[\text{Gists}] = G(S, I, K)$$

  • Value function:

$$V(S, G, K, t) = W \cdot (-1)^{C(\text{gist})} + \text{Curiosity} - t$$
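
A small numerical sketch of these MotW formulas, with hypothetical interpretations for symbols the summary leaves abstract: $S$ and $K$ are treated as feature vectors for sensation and stored knowledge, $W$ as a scalar weight, $C(\text{gist})$ as a binary consistency flag, and Curiosity as a scalar drive. How $M(S, K)$ decomposes into inner- and outer-world components is not specified here, so the sketch returns the blended vector unsplit.

```python
import numpy as np

def model_of_world(S, K, a_s: float = 0.6, a_k: float = 0.4):
    """Modeling function M(S, K) = a_s*S + a_k*K; the decomposition into
    [inner_world, outer_world] is left abstract in this sketch."""
    return a_s * np.asarray(S) + a_k * np.asarray(K)

def value(W: float, c_gist: int, curiosity: float, t: float) -> float:
    """Value function V = W * (-1)^C(gist) + Curiosity - t, as stated above;
    c_gist is interpreted here as 0 (consistent) or 1 (conflicting)."""
    return W * (-1) ** c_gist + curiosity - t

S = np.random.rand(8)   # hypothetical sensory features
K = np.random.rand(8)   # hypothetical knowledge-derived features
world_model = model_of_world(S, K)
v = value(W=2.0, c_gist=0, curiosity=0.5, t=0.1)  # positive value: pursue this gist
```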

7. Limitations, Implications, and Theoretical Significance

While the CTM is not physiologically precise and abstracts away from substrate constraints, its mathematical clarity, operational definitions, and explanatory reach constitute its principal contributions. It provides mechanisms for both access and phenomenological consciousness, supports functionalist/illusionist perspectives, and serves as a platform for rigorous debate and empirical confrontation regarding machine consciousness (Blum et al., 2021, Blum et al., 2020, Blum et al., 25 Mar 2024).

Limitations include absence of direct empirical testability, lack of a formal definition of "feeling," and omission of biological complexity. Nonetheless, the unification of computational, neuroscientific, and psychological insights positions CTM as a foundational architecture for further research into consciousness and AGI.


Summary Table: CTM Key Mechanisms

| Mechanism | Description | Theoretical Principle |
| --- | --- | --- |
| STM Stage | Holds and broadcasts one chunk per tick | Working memory, global access |
| LTM Processors | Massively parallel, self-improving | Modular AI agents, parallelism |
| Tournament Competition | Selects conscious content probabilistically | Self-organization, value selection |
| Sleeping Experts | Local learning, prediction, feedback | No teacher, continual adaptation |
| Processor Links | Direct unconscious communication | Emergent modularity |
| MotW Processor | Labeled modeling of self/world/others | Planning, indirect perception |

The CTM offers a mathematically explicit, modular, and scalable model of consciousness. Its distributed, competition-based architecture explicates conscious access, subjective experience, and collective intelligence, constituting a rigorous foundation for artificial consciousness and the development of AGI.
