Incorporating the human perspective of attention into AI models

Develop AI modeling frameworks that explicitly integrate the human perspective of attention—particularly intentional control and joint attention—into Transformer-based systems, so that attention functions as an agency-related mechanism rather than solely computing pairwise relationships within input sequences.

Background

The authors argue that, unlike human attention, which serves as an explicit component of agency and supports joint attention in social cooperation, Transformer-based models primarily use attention to compute relationships among sequence elements and do not encode intentional or joint-attentional mechanisms.
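
To make the contrast concrete, the sketch below shows standard scaled dot-product attention, in which attention is nothing more than a softmax over pairwise query-key scores. This is a minimal illustration of the mechanism the authors describe; the code itself is not from the paper.

```python
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v):
    """q, k, v: (batch, seq_len, d) tensors."""
    d = q.size(-1)
    # Pairwise relevance scores between every query and every key.
    scores = q @ k.transpose(-2, -1) / d ** 0.5  # (batch, seq_len, seq_len)
    weights = F.softmax(scores, dim=-1)          # attention over the sequence
    return weights @ v                           # weighted sum of the values
```

Nothing in this computation references goals, intentions, or other agents: the weights are determined entirely by content similarity within the input sequence, which is the limitation the authors highlight.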

They note that recent multi-agent algorithms based on Transformers do not explicitly incorporate joint attention, motivating the need for approaches that bring human agency-related attention constructs into AI systems.
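
As a purely illustrative sketch of what such an approach might look like, the following hypothetical penalty encourages several agents' attention distributions to align on the same targets. The function name and formulation are assumptions introduced here, not a mechanism proposed in the paper; they are one possible way to operationalize joint attention as a training signal.

```python
import torch

def joint_attention_penalty(attn_weights):
    """attn_weights: (n_agents, seq_len) attention distributions, one row per agent.

    Returns a scalar that is low when agents attend to the same targets:
    a simple, hypothetical stand-in for joint attention.
    """
    # Consensus focus: the average attention distribution across agents.
    mean_attn = attn_weights.mean(dim=0, keepdim=True)  # (1, seq_len)
    # KL divergence of each agent's focus from the consensus.
    kl = (attn_weights * (attn_weights.clamp_min(1e-9).log()
                          - mean_attn.clamp_min(1e-9).log())).sum(dim=-1)
    return kl.mean()
```

A term like this could in principle be added to a multi-agent training loss, but whether divergence-based alignment captures the intentional character of human joint attention is exactly the open question posed here.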

References

As a result, it remains an open question how we can more effectively incorporate the human perspective of attention into AI models.

Zhao et al., "From Cognition to Computation: A Comparative Review of Human Attention and Transformer Architectures," arXiv:2407.01548, 25 Apr 2024, Section 4.4 ("How can attention be formulated as an explicit component of agency?").