Multilingual Lexical Simplification via Paraphrase Generation (2307.15286v1)

Published 28 Jul 2023 in cs.CL

Abstract: Lexical simplification (LS) methods based on pretrained language models have made remarkable progress, generating potential substitutes for a complex word by analyzing its contextual surroundings. However, these methods require separate pretrained models for different languages and disregard the preservation of sentence meaning. In this paper, we propose a novel multilingual LS method via paraphrase generation, as paraphrases provide diversity in word selection while preserving the sentence's meaning. We treat paraphrasing as a zero-shot translation task within multilingual neural machine translation, which supports hundreds of languages. After feeding the input sentence into the encoder of the paraphrase model, we generate substitutes using a novel decoding strategy that concentrates solely on the lexical variations of the complex word. Experimental results demonstrate that our approach significantly surpasses BERT-based methods and a zero-shot GPT-3-based method on English, Spanish, and Portuguese.

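The core idea, treating paraphrasing as zero-shot translation with a multilingual NMT model and restricting decoding to the slot of the complex word, can be sketched with an off-the-shelf checkpoint. The snippet below is a minimal illustration, not the authors' implementation: it assumes the public facebook/m2m100_418M model as a stand-in paraphrase model, forces the decoder through the sentence prefix before the complex word, and reads the top-k next-token candidates as single-subword substitutes; the helper name `substitute_candidates` and the checkpoint choice are assumptions, and the paper's actual decoding strategy also handles multi-token substitutes and ranking.

```python
import torch
from transformers import M2M100ForConditionalGeneration, M2M100Tokenizer

MODEL = "facebook/m2m100_418M"  # assumed stand-in multilingual NMT checkpoint
tok = M2M100Tokenizer.from_pretrained(MODEL)
model = M2M100ForConditionalGeneration.from_pretrained(MODEL).eval()

@torch.no_grad()
def substitute_candidates(sentence: str, complex_word: str,
                          lang: str = "en", k: int = 10):
    """Hypothetical helper: top-k single-subword substitutes for complex_word."""
    tok.src_lang = lang  # encode the sentence in its own language ...
    tok.tgt_lang = lang  # ... and "translate" back into it: zero-shot paraphrase

    # Decoder prefix: the sentence up to (but excluding) the complex word.
    prefix = sentence[: sentence.index(complex_word)].rstrip()
    batch = tok(sentence, text_target=prefix, return_tensors="pt")

    # labels = [tgt_lang_id, prefix_tokens..., </s>]; drop </s> and prepend
    # M2M100's decoder start token so decoding is forced through the prefix.
    start = torch.tensor([model.config.decoder_start_token_id])
    dec_in = torch.cat([start, batch["labels"][0][:-1]]).unsqueeze(0)

    out = model(input_ids=batch["input_ids"],
                attention_mask=batch["attention_mask"],
                decoder_input_ids=dec_in)
    # Distribution over the token filling the complex word's slot.
    next_logits = out.logits[0, -1]
    cands = [tok.decode([i]).strip()
             for i in next_logits.topk(k).indices.tolist()]
    return [c for c in cands if c.isalpha()]  # crude filter for word-like tokens

print(substitute_candidates("The judge will adjudicate the dispute.",
                            "adjudicate"))
```

Because the encoder sees the full sentence, the candidates remain grounded in the original meaning, which is the property the paper relies on paraphrasing to provide.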
Authors (6)
  1. Kang Liu (207 papers)
  2. Jipeng Qiang (22 papers)
  3. Yun Li (154 papers)
  4. Yunhao Yuan (18 papers)
  5. Yi Zhu (233 papers)
  6. Kaixun Hua (5 papers)
Citations (2)