Structural analysis of an all-purpose question answering model (2104.06045v1)
Abstract: Attention is a key component of the now ubiquitous pre-trained LLMs. By learning to focus on relevant pieces of information, these Transformer-based architectures have proven capable of tackling several tasks at once, sometimes even surpassing their single-task counterparts. To better understand this phenomenon, we conduct a structural analysis of a new all-purpose question answering model that we introduce. Surprisingly, this model retains single-task performance even in the absence of a strong transfer effect between tasks. Through attention head importance scoring, we observe that attention heads specialize in a particular task, and that some heads are more conducive to learning than others in both the multi-task and single-task settings.
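The abstract does not spell out how head importance is scored. A common approach, which may differ from the authors' exact procedure, is the gradient-based score of Michel et al. (2019): multiply each head's output by a gate fixed at 1 and measure the average magnitude of the loss gradient with respect to that gate. The PyTorch sketch below is illustrative only; the names (`GatedMultiHeadAttention`, `head_importance`) and the toy data are hypothetical.

```python
import torch
import torch.nn as nn


class GatedMultiHeadAttention(nn.Module):
    """Multi-head self-attention with one scalar gate per head (illustrative)."""

    def __init__(self, d_model: int, n_heads: int):
        super().__init__()
        assert d_model % n_heads == 0
        self.n_heads = n_heads
        self.d_head = d_model // n_heads
        self.qkv = nn.Linear(d_model, 3 * d_model)
        self.out = nn.Linear(d_model, d_model)
        # Gates stay at 1 and are never trained; they only expose per-head gradients.
        self.head_gates = nn.Parameter(torch.ones(n_heads))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, t, d = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        # Split into (batch, heads, tokens, d_head).
        q, k, v = (z.view(b, t, self.n_heads, self.d_head).transpose(1, 2)
                   for z in (q, k, v))
        attn = torch.softmax(q @ k.transpose(-2, -1) / self.d_head ** 0.5, dim=-1)
        heads = attn @ v                                    # (b, h, t, d_head)
        heads = heads * self.head_gates.view(1, -1, 1, 1)   # gate each head's output
        return self.out(heads.transpose(1, 2).reshape(b, t, d))


def head_importance(model: GatedMultiHeadAttention, batches, loss_fn) -> torch.Tensor:
    """Average |d loss / d gate| over batches; larger values = more important heads."""
    scores = torch.zeros(model.n_heads)
    for x, y in batches:
        model.zero_grad()
        loss_fn(model(x), y).backward()
        scores += model.head_gates.grad.abs()
    return scores / len(batches)


if __name__ == "__main__":
    torch.manual_seed(0)
    mha = GatedMultiHeadAttention(d_model=64, n_heads=8)
    # Toy regression batches standing in for a real task's loss signal.
    data = [(torch.randn(4, 10, 64), torch.randn(4, 10, 64)) for _ in range(8)]
    print(head_importance(mha, data, nn.MSELoss()))
```

In a multi-task setting, running this scoring once per task's loss yields a per-task importance profile for each head, which is presumably the kind of comparison behind the abstract's observation that heads specialize in particular tasks.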
- Vincent Micheli
- Quentin Heinrich
- François Fleuret
- Wacim Belblidia