Adapting Sequence to Sequence models for Text Normalization in Social Media (1904.06100v1)

Published 12 Apr 2019 in cs.CL, cs.AI, and cs.LG

Abstract: Social media offer an abundant source of valuable raw data; however, informal writing can quickly become a bottleneck for many NLP tasks. Off-the-shelf tools are usually trained on formal text and cannot explicitly handle noise found in short online posts. Moreover, the variety of frequently occurring linguistic variations presents several challenges, even for humans, who might not be able to comprehend the meaning of such posts, especially when they contain slang and abbreviations. Text Normalization aims to transform online user-generated text to a canonical form. Current text normalization systems rely on string or phonetic similarity and classification models that operate in a local fashion. We argue that processing contextual information is crucial for this task and introduce a hybrid word-character attention-based encoder-decoder model for social media text normalization that can serve as a pre-processing step for NLP applications to adapt to noisy text in social media. Our character-based component is trained on synthetic adversarial examples that are designed to capture errors commonly found in online user-generated text. Experiments show that our model surpasses neural architectures designed for text normalization and achieves performance comparable to state-of-the-art related work.
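The paper does not ship code, so the sketch below is only a hypothetical illustration of the synthetic-noise idea the abstract describes: corrupting clean text to produce (noisy, canonical) training pairs for the character-based component. The perturbation types (character drops, adjacent swaps, elongations, abbreviation substitutions) and the helper names `noisify` and `make_training_pairs` are assumptions for illustration, not the authors' actual noising procedure.

```python
import random

# Hypothetical noise-injection sketch: map clean words to plausible
# social-media-style variants to create supervised training pairs.
# The perturbation inventory here is an assumption, not the paper's.

SUBSTITUTIONS = {"you": "u", "are": "r", "to": "2", "for": "4", "be": "b"}

def drop_char(word: str) -> str:
    """Delete one random character, e.g. 'going' -> 'goin'."""
    if len(word) < 2:
        return word
    i = random.randrange(len(word))
    return word[:i] + word[i + 1:]

def swap_chars(word: str) -> str:
    """Transpose two adjacent characters, e.g. 'the' -> 'teh'."""
    if len(word) < 2:
        return word
    i = random.randrange(len(word) - 1)
    return word[:i] + word[i + 1] + word[i] + word[i + 2:]

def repeat_char(word: str) -> str:
    """Elongate one character, e.g. 'so' -> 'sooo'."""
    i = random.randrange(len(word))
    return word[:i] + word[i] * random.randint(2, 4) + word[i + 1:]

def noisify(word: str) -> str:
    """Map a clean word to a plausible noisy variant."""
    if word in SUBSTITUTIONS:
        return SUBSTITUTIONS[word]
    return random.choice([drop_char, swap_chars, repeat_char])(word)

def make_training_pairs(sentence: str):
    """Produce (noisy, canonical) word pairs from clean text."""
    return [(noisify(w), w) for w in sentence.lower().split()]

if __name__ == "__main__":
    random.seed(0)
    print(make_training_pairs("are you going to be there"))
```

In the paper's setup, pairs like these would train the character-level part of the model, while the word-level encoder-decoder with attention consumes surrounding context, so that normalization decisions are driven by context rather than string or phonetic similarity alone.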

Authors (3)
  1. Ismini Lourentzou (27 papers)
  2. Kabir Manghnani (3 papers)
  3. ChengXiang Zhai (64 papers)
Citations (33)
