Interaction of Positional Embeddings, Local Attention, and Geometric Formation
Characterize the interaction among positional embeddings, local attention kernels (including sliding-window attention), and the formation of the low-dimensional geometric substrate (e.g., entropy-ordered value manifolds) in transformer language models, with particular attention to sliding-window and hybrid transformer–state-space-model architectures.
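To make the objects in this question concrete, the following minimal sketch (an illustrative assumption, not taken from the cited paper) implements the two ingredients whose interaction is at issue: standard sinusoidal positional embeddings and a single-head sliding-window (local, causal) attention kernel. The function names and the choice of window size are hypothetical.

```python
import numpy as np

def sinusoidal_positions(seq_len, d_model):
    """Standard sinusoidal positional embeddings (sin on even dims, cos on odd)."""
    pos = np.arange(seq_len)[:, None]
    i = np.arange(d_model)[None, :]
    angles = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    emb = np.zeros((seq_len, d_model))
    emb[:, 0::2] = np.sin(angles[:, 0::2])
    emb[:, 1::2] = np.cos(angles[:, 1::2])
    return emb

def sliding_window_mask(seq_len, window):
    """Causal mask: each query may attend only to itself and the previous window-1 keys."""
    idx = np.arange(seq_len)
    rel = idx[:, None] - idx[None, :]      # query index minus key index
    return (rel >= 0) & (rel < window)     # True where attention is allowed

def local_attention(x, window):
    """Single-head scaled dot-product self-attention under the sliding-window mask."""
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)
    scores = np.where(sliding_window_mask(len(x), window), scores, -np.inf)
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    return w @ x

# Feed the positional embeddings themselves through local attention, so the
# output geometry reflects only position information and the local kernel.
seq_len, d_model, window = 8, 16, 3
x = sinusoidal_positions(seq_len, d_model)
out = local_attention(x, window)
print(out.shape)  # (8, 16)
```

The question above asks, in effect, how the geometry of `out` (and of deeper-layer analogues such as entropy-ordered value manifolds) is shaped jointly by the positional code and the locality of the mask, and how that picture changes in hybrid transformer–SSM stacks where some layers have no attention mask at all.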
References
Likewise, the interaction between positional embeddings, local attention kernels, and geometric formation remains an open problem, especially in sliding-window or hybrid transformer–SSM architectures.
— Geometric Scaling of Bayesian Inference in LLMs
(2512.23752 - Aggarwal et al., 27 Dec 2025) in Analysis and Key Findings — Robustness and Limitations — Open representational questions