Improving Code-switching Language Modeling with Artificially Generated Texts using Cycle-consistent Adversarial Networks (2112.06327v1)
Published 12 Dec 2021 in cs.CL
Abstract: This paper presents our latest effort on improving Code-switching language models that suffer from data scarcity. We investigate methods to augment Code-switching training text data by generating it artificially. Concretely, we propose a framework based on cycle-consistent adversarial networks that transfers monolingual text into Code-switching text, treating Code-switching as a speaking style. Our experimental results on the SEAME corpus show that utilising artificially generated Code-switching text data consistently improves both the language model and the automatic speech recognition performance.
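To illustrate the core idea, here is a minimal PyTorch sketch of the cycle-consistency objective behind such a framework: two generators map between the monolingual and Code-switching styles, and translating to the other style and back should reconstruct the input. This is not the authors' implementation; the generator architecture, the embedding-space simplification, and all names (`g_mono2cs`, `g_cs2mono`) are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Generator(nn.Module):
    """Hypothetical style-transfer generator. A real system would use a
    sequence model (e.g. an encoder-decoder over tokens); a small MLP
    over fixed-size sentence embeddings stands in for it here."""
    def __init__(self, dim: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, dim),
            nn.ReLU(),
            nn.Linear(dim, dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

g_mono2cs = Generator()  # monolingual -> Code-switching style
g_cs2mono = Generator()  # Code-switching -> monolingual style

# Dummy batches of (assumed) sentence embeddings from the two unpaired corpora.
x_mono = torch.randn(8, 256)
x_cs = torch.randn(8, 256)

# Cycle-consistency loss: round-tripping through the other style should
# recover the original input, which regularises unpaired style transfer.
cycle_loss = (
    F.l1_loss(g_cs2mono(g_mono2cs(x_mono)), x_mono)
    + F.l1_loss(g_mono2cs(g_cs2mono(x_cs)), x_cs)
)
cycle_loss.backward()  # in full training this is combined with adversarial losses
```

In the full CycleGAN-style setup, this term is added to adversarial losses from discriminators that judge whether generated text matches the target style, letting the model learn from unpaired monolingual and Code-switching corpora.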