Integrating Contrastive Learning into a Multitask Transformer Model for Effective Domain Adaptation (2310.04703v1)
Published 7 Oct 2023 in cs.CL, cs.HC, and cs.LG
Abstract: While speech emotion recognition (SER) research has made significant progress, achieving generalization across various corpora continues to pose a problem. We propose a novel domain adaptation technique that employs a multitask framework with SER as the primary task and contrastive learning and information maximisation loss as auxiliary tasks, underpinned by fine-tuning of transformers pre-trained on large language models. Empirical results from experiments on well-established datasets such as IEMOCAP and MSP-IMPROV demonstrate that the proposed model achieves state-of-the-art SER performance in cross-corpus scenarios.
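To make the multitask objective concrete, below is a minimal PyTorch sketch of how the three loss terms could be combined. This is an illustrative assumption, not the paper's implementation: the abstract does not specify the contrastive variant, the information-maximisation formulation, or the loss weights, so a SupCon-style contrastive term, a standard info-max term (low per-sample entropy, high batch-level entropy), and illustrative weights `lam_con` and `lam_im` are assumed here.

```python
# Hedged sketch of the multitask objective described in the abstract.
# The contrastive variant, info-max formulation, and loss weights are
# assumptions; the paper only states that SER is the primary task with
# contrastive learning and information maximisation as auxiliary tasks.
import torch
import torch.nn.functional as F


def info_max_loss(logits: torch.Tensor) -> torch.Tensor:
    """Information maximisation: encourage confident per-sample
    predictions (low conditional entropy) and diverse predictions
    across the batch (high marginal entropy)."""
    probs = F.softmax(logits, dim=-1)
    log_probs = F.log_softmax(logits, dim=-1)
    # Conditional entropy H(Y|X): mean entropy of each prediction.
    cond_entropy = -(probs * log_probs).sum(dim=-1).mean()
    # Marginal entropy H(Y): entropy of the batch-averaged prediction.
    mean_probs = probs.mean(dim=0)
    marg_entropy = -(mean_probs * torch.log(mean_probs + 1e-8)).sum()
    return cond_entropy - marg_entropy  # minimise => confident + diverse


def supervised_contrastive_loss(embeds, labels, temperature=0.1):
    """One common contrastive choice (SupCon-style); the paper's exact
    variant is not specified in this abstract."""
    embeds = F.normalize(embeds, dim=-1)
    sim = embeds @ embeds.t() / temperature
    # Mask out self-similarity on the diagonal.
    n = embeds.size(0)
    self_mask = torch.eye(n, dtype=torch.bool, device=embeds.device)
    sim = sim.masked_fill(self_mask, float("-inf"))
    # Positives: other samples sharing the same emotion label.
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    # Average log-probability over each anchor's positives.
    pos_counts = pos_mask.sum(dim=1).clamp(min=1)
    masked_log_prob = torch.where(pos_mask, log_prob, torch.zeros_like(log_prob))
    loss = -masked_log_prob.sum(dim=1) / pos_counts
    valid = pos_mask.any(dim=1)  # skip anchors with no positives
    return loss[valid].mean() if valid.any() else logits_zero(embeds)


def logits_zero(t: torch.Tensor) -> torch.Tensor:
    """Zero fallback when a batch has no positive pairs."""
    return t.new_zeros(())


def multitask_loss(logits, embeds, labels, lam_con=0.5, lam_im=0.5):
    """SER cross-entropy as the primary term plus the two auxiliary
    terms; the weights lam_con and lam_im are illustrative."""
    ser = F.cross_entropy(logits, labels)
    con = supervised_contrastive_loss(embeds, labels)
    im = info_max_loss(logits)
    return ser + lam_con * con + lam_im * im
```

In this reading, the SER cross-entropy drives the primary task, the contrastive term pulls same-emotion embeddings together across corpora, and the info-max term regularises predictions on target-domain batches; how the terms are weighted and scheduled would be a design choice of the actual model.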