Alignment of Transformer attention representations with human agency-based importance
Determine whether the token representations produced by the self-attention mechanism in Transformer architectures align with human agency-based judgments of importance, specifically whether attention-derived representations correspond to what individuals intentionally deem important given their beliefs, goals, and intentions.
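To make the comparison concrete, the sketch below shows one way such an alignment study could be set up: derive per-token importance scores from a pretrained Transformer's self-attention weights and correlate them with human importance ratings. This is an illustrative sketch only; the model name, example sentence, importance proxy (mean attention received), and human ratings are all assumptions, not methods from the cited paper.

```python
import torch
from scipy.stats import spearmanr
from transformers import AutoModel, AutoTokenizer

# Assumed off-the-shelf encoder; any Transformer exposing attentions would do.
model_name = "bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name, output_attentions=True)
model.eval()

# Placeholder sentence chosen to involve an agent's goals and intentions.
sentence = "She grabbed the keys because she intended to leave early."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# outputs.attentions is a tuple of (batch, heads, seq, seq) tensors, one per layer.
# A simple proxy for token importance is the attention each token *receives*,
# averaged over layers, heads, and query positions.
attn = torch.stack(outputs.attentions)                   # (layers, batch, heads, seq, seq)
token_importance = attn.mean(dim=(0, 2, 3)).squeeze(0)   # (seq,)

tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])

# Hypothetical human agency-based importance ratings for the same tokens
# (e.g., annotators marking which words matter for the agent's goal).
# Random values stand in for real annotations here.
human_ratings = torch.rand(len(tokens))

rho, p_value = spearmanr(token_importance.numpy(), human_ratings.numpy())
print(f"Spearman correlation, attention importance vs. human ratings: "
      f"{rho:.3f} (p={p_value:.3f})")
```

A positive, significant rank correlation under such a setup would be evidence that attention-derived representations track agency-based importance; the open question is whether this holds with real human annotations rather than the placeholder ratings used above.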
References
It remains unclear whether the representations output from the attention mechanism coincide with what one believes to be important from an agency perspective.
— From Cognition to Computation: A Comparative Review of Human Attention and Transformer Architectures (arXiv:2407.01548, Zhao et al., 25 Apr 2024), Subsubsection 3.2.3 (Intentional nature)