CONFLATOR: Incorporating Switching Point based Rotatory Positional Encodings for Code-Mixed Language Modeling (2309.05270v2)

Published 11 Sep 2023 in cs.CL and cs.LG

Abstract: The mixing of two or more languages is called Code-Mixing (CM). CM is a social norm in multilingual societies. Neural Language Models (NLMs) like transformers have been effective on many NLP tasks. However, NLM for CM is an under-explored area. Though transformers are capable and powerful, they do not always encode positional information since they are non-recurrent. Therefore, to enrich word information and incorporate positional information, positional encoding is defined. We hypothesize that Switching Points (SPs), i.e., junctions in the text where the language switches (L1 -> L2 or L2 -> L1), pose a challenge for CM Language Models (LMs), and hence give special emphasis to SPs in the modeling process. We experiment with several positional encoding mechanisms and show that rotatory positional encodings along with switching point information yield the best results. We introduce CONFLATOR: a neural language modeling approach for code-mixed languages. CONFLATOR tries to learn to emphasize switching points using smarter positional encoding, both at unigram and bigram levels. CONFLATOR outperforms the state-of-the-art on two tasks based on code-mixed Hindi and English (Hinglish): (i) sentiment analysis and (ii) machine translation.
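To make the idea concrete, the sketch below shows one way switching-point information could be folded into rotary positional encodings. It is not the paper's exact formulation: the `switching_point_positions` scheme (resetting the position counter at each language switch so tokens near SPs receive distinctive rotation angles) and all names are illustrative assumptions; only the rotary encoding itself follows the standard RoPE construction.

```python
import numpy as np

def rotary_encode(x, positions, base=10000.0):
    """Apply RoPE-style rotations to embeddings x of shape
    (seq_len, d_model), d_model even. `positions` gives the
    position index used for each token's rotation angle."""
    seq_len, d_model = x.shape
    half = d_model // 2
    # One frequency per rotation pair, as in standard RoPE.
    inv_freq = 1.0 / (base ** (np.arange(half) / half))
    # Angle for every (position, frequency) pair.
    theta = np.outer(positions, inv_freq)        # (seq_len, half)
    cos, sin = np.cos(theta), np.sin(theta)
    x1, x2 = x[:, :half], x[:, half:]
    # 2-D rotation applied to each embedding pair.
    return np.concatenate([x1 * cos - x2 * sin,
                           x1 * sin + x2 * cos], axis=-1)

def switching_point_positions(lang_tags):
    """Hypothetical SP-aware scheme: restart the position counter
    at every switching point (junction where the language changes),
    so the encoding singles out tokens at and after SPs."""
    positions, pos = [], 0
    for i, tag in enumerate(lang_tags):
        if i > 0 and tag != lang_tags[i - 1]:  # language switch: SP
            pos = 0                            # reset position at SP
        positions.append(pos)
        pos += 1
    return np.array(positions)

# Toy Hinglish sentence with per-token language tags
# (L1 = Hindi "hi", L2 = English "en").
lang_tags = ["hi", "hi", "en", "en", "en", "hi"]
x = np.random.randn(len(lang_tags), 8)
x_sp = rotary_encode(x, switching_point_positions(lang_tags))
```

The same idea extends to the bigram level mentioned in the abstract by tagging token pairs instead of single tokens, so that a pair straddling an SP gets its own positional signature.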

Authors (8)
  1. Mohsin Ali (8 papers)
  2. Kandukuri Sai Teja (2 papers)
  3. Neeharika Gupta (1 paper)
  4. Parth Patwa (28 papers)
  5. Anubhab Chatterjee (1 paper)
  6. Vinija Jain (43 papers)
  7. Aman Chadha (110 papers)
  8. Amitava Das (45 papers)
