Geometric Structure of State-Space and Hybrid Transformer–SSM Models
Characterize the geometric structure of state-space models such as Mamba and hybrid transformer–state-space-model architectures, determining the presence and form of uncertainty-representation signatures (e.g., value manifolds, key orthogonality, attention focusing) and developing appropriate extraction methods if required.
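One way to begin probing this question is to pull hidden states out of a state-space model and examine their spectral geometry directly. The sketch below is illustrative only: it assumes the Hugging Face `transformers` Mamba integration and the `state-spaces/mamba-130m-hf` checkpoint, and the effective-dimensionality probe is a generic stand-in, not the paper's extraction method for value manifolds, key orthogonality, or attention focusing.

```python
# Minimal probing sketch (assumptions: transformers' Mamba support and the
# "state-spaces/mamba-130m-hf" checkpoint; the probe itself is illustrative).
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_name = "state-spaces/mamba-130m-hf"  # any Mamba or hybrid checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, output_hidden_states=True)
model.eval()

prompts = [
    "The capital of France is a city that many travelers visit every year because",
    "The capital of France is probably, though I am not entirely certain,",
]

with torch.no_grad():
    for text in prompts:
        inputs = tokenizer(text, return_tensors="pt")
        out = model(**inputs)
        # hidden_states: tuple of (num_layers + 1) tensors, each [1, seq, d_model]
        h = out.hidden_states[-1][0]          # last layer, shape [seq, d_model]
        h = h - h.mean(dim=0, keepdim=True)   # center the token representations
        # Effective dimensionality from the singular-value spectrum:
        # a sharp spectral decay hints at low-dimensional manifold structure.
        s = torch.linalg.svdvals(h)
        p = (s ** 2) / (s ** 2).sum()
        eff_dim = torch.exp(-(p * torch.log(p + 1e-12)).sum())
        print(f"{text!r}: effective dim ≈ {eff_dim.item():.2f} (d_model = {h.shape[1]})")
```

Comparing such spectra across layers, prompts of varying uncertainty, and across pure-SSM versus hybrid transformer–SSM checkpoints would be a first, crude test for whether transformer-style uncertainty signatures carry over.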
References
Other architectures – notably state-space models such as Mamba and hybrid transformer–SSM designs – may require specialized extraction methods, and their geometric structure remains an open question.
— Geometric Scaling of Bayesian Inference in LLMs
(arXiv:2512.23752, Aggarwal et al., 27 Dec 2025), in Discussion — Limitations and Future Directions — Architectural coverage