Expressivity of Softmax Transformers and Their Relation to UHATs
Characterize the class of formal languages recognized by softmax attention transformers with position embeddings, and determine whether this class subsumes the class of languages recognized by unique hard attention transformers (UHATs) with position embeddings.
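To make the contrast behind the question concrete, the following minimal sketch (an illustrative assumption, not taken from the paper) computes the attention weights of one query under the two rules: standard softmax attention spreads probability mass over all positions, while unique hard attention places all mass on the single maximum-scoring position (leftmost on ties), which is why a sharpened softmax does not trivially simulate it.

import numpy as np

def softmax_attention(scores: np.ndarray) -> np.ndarray:
    """Standard softmax attention weights for one query's scores."""
    e = np.exp(scores - scores.max())
    return e / e.sum()

def unique_hard_attention(scores: np.ndarray) -> np.ndarray:
    """UHAT-style weights: all mass on the (leftmost) maximum-scoring position."""
    w = np.zeros_like(scores)
    w[np.argmax(scores)] = 1.0  # np.argmax breaks ties at the first occurrence
    return w

# Toy attention scores for one query over a length-5 input (hypothetical values).
scores = np.array([0.2, 1.3, 0.9, 1.3, -0.5])

print(softmax_attention(scores))       # smooth distribution over all positions
print(unique_hard_attention(scores))   # one-hot at position 1 (leftmost maximum)

# Scaling the scores sharpens the softmax toward the maxima, but on a tie it
# splits mass (~0.5 each here) instead of selecting one position, so the two
# attention rules are not obviously interchangeable.
print(softmax_attention(100.0 * scores))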
References
"For example, we do not know where the expressivity of softmax transformers exactly lies (e.g. do they subsume UHATs?)."
— The Role of Logic and Automata in Understanding Transformers (2509.24024, Lin et al., 28 Sep 2025), Section 6, Limitations of UHATs and AHATs (Limitation 1: Soft attention vs. Hard attention)