
Improving Code-switching Language Modeling with Artificially Generated Texts using Cycle-consistent Adversarial Networks (2112.06327v1)

Published 12 Dec 2021 in cs.CL

Abstract: This paper presents our latest effort on improving Code-switching language models, which suffer from data scarcity. We investigate methods to augment Code-switching training text data by generating it artificially. Concretely, we propose a framework based on cycle-consistent adversarial networks that transfers monolingual text into Code-switching text, treating Code-switching as a speaking style. Our experimental results on the SEAME corpus show that utilising artificially generated Code-switching text data consistently improves both the language model and the automatic speech recognition performance.
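The core idea, transferring monolingual text into Code-switching style and back while penalising reconstruction error, can be sketched as follows. This is a toy illustration of the cycle-consistency objective only: the lexicon-swap "generators" `G` and `F` and the token-level loss are hypothetical stand-ins, not the paper's neural models.

```python
# Toy sketch of cycle consistency for monolingual <-> Code-switching transfer.
# G: monolingual -> Code-switching style; F: the inverse direction.
# The cycle loss penalises F(G(x)) differing from the original x.

CS_LEXICON = {"very": "hen"}     # hypothetical EN -> ZH style swap
MONO_LEXICON = {"hen": "very"}   # inverse mapping

def G(tokens):
    # toy "generator": render a monolingual sentence in Code-switching style
    return [CS_LEXICON.get(t, t) for t in tokens]

def F(tokens):
    # toy inverse "generator": map Code-switching tokens back to monolingual
    return [MONO_LEXICON.get(t, t) for t in tokens]

def cycle_loss(x):
    # fraction of tokens not recovered after the round trip x -> G -> F
    x_rec = F(G(x))
    return sum(a != b for a, b in zip(x, x_rec)) / len(x)

sentence = ["this", "is", "very", "good"]
print(G(sentence))           # ['this', 'is', 'hen', 'good']
print(cycle_loss(sentence))  # 0.0 when F exactly inverts G
```

In the actual framework the generators are trained networks and the cycle loss is combined with adversarial losses from style discriminators; here the round trip recovers the input exactly, so the loss is zero.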

Citations (11)
