Training attention mechanisms for global workspace selection and sequencing
Develop training procedures for the state-dependent attention mechanism in global workspace architectures, so that it can both select among module inputs to the workspace and reliably implement the sequences of attentional operations needed to control extended, functional interactions among modules.
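To make the problem concrete, here is a minimal sketch of the selection mechanism itself, assuming a standard query-key attention formulation: the workspace state generates a query, each module's output is projected to a key, and a softmax over the scores gates which module's content enters the workspace at each step. All names, dimensions, and projections are illustrative assumptions, not the architecture from the cited paper; the open question is how to train such a mechanism so that rolling this step forward yields reliable, functional sequences of selections.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical, illustrative dimensions.
N_MODULES, D = 4, 8

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

class WorkspaceAttention:
    """State-dependent soft selection among module outputs (a sketch)."""

    def __init__(self, d):
        self.W_q = rng.normal(scale=0.1, size=(d, d))  # query projection
        self.W_k = rng.normal(scale=0.1, size=(d, d))  # key projection
        self.W_u = rng.normal(scale=0.1, size=(d, d))  # workspace update

    def step(self, state, module_outputs):
        q = state @ self.W_q                            # (d,) query from workspace state
        keys = module_outputs @ self.W_k                # (n, d) one key per module
        weights = softmax(keys @ q / np.sqrt(len(q)))   # attention over modules
        selected = weights @ module_outputs             # soft-selected content
        new_state = np.tanh(selected @ self.W_u + state)  # broadcast into workspace
        return new_state, weights

# Iterating the step produces a *sequence* of attentional selections;
# training these extended sequences end-to-end is the open problem.
ws = WorkspaceAttention(D)
state = np.zeros(D)
modules = rng.normal(size=(N_MODULES, D))
for t in range(3):
    state, weights = ws.step(state, modules)
```

Because the selection is a differentiable softmax, one candidate training route is end-to-end gradient descent on a downstream task loss, but nothing in the excerpt establishes that this suffices for the multi-step control regime it describes.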
References
However, this work is a "roadmap" to a possible implementation, rather than a working system. It faces a substantial open question about how the attention mechanism could be trained to select among the potential inputs to the workspace, and especially how this could achieve the sequences of operations of attention needed to control extended, functional sequences of operations by relevant modules.
— Consciousness in Artificial Intelligence: Insights from the Science of Consciousness
(arXiv:2308.08708, Butlin et al., 2023), Section 3.1.2, "Implementing GWT"