Geometric Structure of State-Space and Hybrid Transformer–SSM Models

Characterize the geometric structure of state-space models (SSMs) such as Mamba and of hybrid transformer–SSM architectures: determine whether uncertainty-representation signatures (e.g., value manifolds, key orthogonality, attention focusing) are present, what form they take, and, where needed, develop appropriate methods to extract them.
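
As a concrete starting point, the sketch below shows one way the named transformer-side signatures might be quantified from extracted activations. It is a minimal illustration, not the paper's method: the synthetic tensors, shapes, and metric choices (mean off-diagonal cosine similarity for key orthogonality, attention entropy for focusing, participation ratio for value-manifold dimensionality) are all assumptions for illustration.

```python
# A minimal sketch (not the paper's method) of quantifying the three
# transformer-side signatures from extracted activations. Synthetic
# tensors stand in for real keys/values/attention; shapes are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def key_orthogonality(keys):
    """Mean absolute off-diagonal cosine similarity among key vectors.
    keys: (num_tokens, d_head); values near 0 suggest a near-orthogonal frame."""
    k = keys / (np.linalg.norm(keys, axis=-1, keepdims=True) + 1e-8)
    sims = k @ k.T
    return float(np.mean(np.abs(sims[~np.eye(len(k), dtype=bool)])))

def attention_entropy(attn):
    """Mean Shannon entropy (nats) of attention rows; lower = more focused.
    attn: (num_queries, num_keys), rows summing to 1."""
    p = np.clip(attn, 1e-12, 1.0)
    return float(np.mean(-np.sum(p * np.log(p), axis=-1)))

def effective_dim(x):
    """Participation ratio of PCA eigenvalues, one proxy for the
    dimensionality of a value manifold. x: (num_points, dim)."""
    x = x - x.mean(axis=0, keepdims=True)
    lam = np.clip(np.linalg.eigvalsh(np.cov(x, rowvar=False)), 0, None)
    return float(lam.sum() ** 2 / (np.square(lam).sum() + 1e-12))

# Synthetic stand-ins for one attention head's activations.
keys = rng.normal(size=(128, 64))
values = rng.normal(size=(128, 64))
logits = rng.normal(size=(32, 128))
attn = np.exp(logits) / np.exp(logits).sum(axis=-1, keepdims=True)

print("key orthogonality:        ", key_orthogonality(keys))
print("attention entropy (nats): ", attention_entropy(attn))
print("value effective dimension:", effective_dim(values))
```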

Background

The paper primarily studies dense transformers with multi-head attention, grouped-query attention, and sliding-window/MoE variants. It notes that other architectures—especially state-space models (SSMs) and hybrid transformer–SSM designs—may differ substantially and could require specialized methods to extract geometric signatures.

Because the core claims hinge on a geometric substrate persisting across architectures, it remains an open question whether SSMs and hybrids possess analogous manifolds and frames, and how such structure can be measured robustly. Answering it would broaden the geometric framework beyond standard transformer designs.
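
The SSM-side half of this question has no attention maps to analyze, so any analogue must come from the recurrent state itself. The toy sketch below, a simplified diagonal linear recurrence standing in for a Mamba-style selective SSM (not its actual implementation), shows how the same manifold statistic used above could be applied to a state trajectory; all parameters and shapes are illustrative assumptions.

```python
# Hedged toy sketch: does the state trajectory of a diagonal linear SSM
# h_t = a * h_{t-1} + b * x_t (a stand-in for a Mamba-style block, whose
# parameters are in reality input-dependent) trace a low-dimensional
# manifold analogous to a transformer value manifold?
import numpy as np

rng = np.random.default_rng(0)
d_state, seq_len = 256, 512

# Per-channel decay and input gains (fixed here for simplicity).
a = rng.uniform(0.8, 0.99, size=d_state)
b = rng.normal(size=d_state)

x = rng.normal(size=seq_len)            # synthetic scalar input stream
states = np.zeros((seq_len, d_state))
h = np.zeros(d_state)
for t in range(seq_len):
    h = a * h + b * x[t]                # diagonal SSM recurrence
    states[t] = h

# Effective dimensionality of the state trajectory (participation ratio),
# the same statistic one might compute on transformer value vectors.
c = states - states.mean(axis=0, keepdims=True)
lam = np.clip(np.linalg.eigvalsh(np.cov(c, rowvar=False)), 0, None)
pr = lam.sum() ** 2 / (np.square(lam).sum() + 1e-12)
print(f"state-trajectory effective dimension: {pr:.1f} of {d_state}")
```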

References

Other architectures, notably state-space models such as Mamba and hybrid transformer–SSM designs, may require specialized extraction methods, and their geometric structure remains an open question.

Geometric Scaling of Bayesian Inference in LLMs  (2512.23752 - Aggarwal et al., 27 Dec 2025) in Discussion — Limitations and Future Directions — Architectural coverage